China: Better at censoring blogs than malware

Security research firm Sophos has just released new malware statistics. For the first time, China tops the list of countries hosting malware-infected web sites. That honor has traditionally gone to the US, which was the leader in 2006, but China has now pulled ahead with 35.6 percent of all infected sites.

As Internet activity and economic growth skyrocket in China, the country is quickly becoming a malware powerhouse. An earlier Sophos report claimed that 30 percent of all malware written in 2006 originated from China, with 17 percent of that 30 percent devoted to stealing the passwords of online gamers. This stands in contrast to Brazil, where malware authors were mostly interested in gathering online banking information rather than game logins.

The US comes in just behind China, hosting 32.3 percent of all malware-infected sites, and Germany is a distant third with 7.5 percent. Russia, which hosted nearly 10 percent of all such sites in 2006, now hosts less than 5 percent of them, a drop of 50 percent in only a few months. This does not necessarily mean that Russia has suddenly grown more law-abiding, though; it means only that malware hosting elsewhere is growing even faster.

Not that all malware is bad: Sophos notes that a 20-year-old German man turned himself in to federal authorities last year after receiving a virus-encumbered message that claimed to be from German authorities. The man, who had child porn on his PC, promptly gave himself up, believing that investigators were on to him.

Sophos expects malware to enjoy healthy growth in 2007, saying that "we expect to see even more devious attempts to steal information with the end goal of financial gain."

But even items not designed for financial gain—like chain letters and e-mail hoaxes—are still going strong. The most popular (by several orders of magnitude) remains the Hotmail hoax, which warns users that their accounts could be terminated unless the e-mail is forwarded to 10 other Hotmail users. The fact that such a transparent ruse can achieve so much success is a reminder that general technical savvy on the Internet remains low.

Even more bizarre is the second-most popular hoax, the "Olympic Torch" e-mail that warns readers about a new virus. "It is a virus that opens an Olympic Torch which 'burns' the whole hard disc C of your computer," says the note.

The only upside to the report is that, should China or the US ever enter a conflict in which they need to call on the resources of thousands of Internet crooks, each country apparently has an inexhaustible supply of field-tested talent.

Science and the public purse

A couple of weeks ago, I wrote about the good news in the annual budget for UK scientists, who are seeing their funding raised by more than 20 percent over the next few years. At the time, I was broadly supportive of that announcement, but the emphasis on applied and translational research gave me pause.

I had the fortune to hear Dr. Elias Zerhouni, head of the NIH, speak this weekend, and if anything it has further strengthened my misgivings over Gordon Brown's plans. One point his speech drove home was the NIH's determination to continue funding basic science as a priority over applied or clinical research, with good reason: although the various institutes that make up the NIH have a combined research budget of around $28 billion, the pharmaceutical and biotech industries spend roughly twice that amount each year on research, and very little of that industry money goes to basic science.

Leaving aside the argument that sometimes comes up—that federal funding of basic research amounts to a subsidy for industry—it seems clear to me that it's perfectly right for governments to pay for research that is both important and noncommercial. Expecting that research to turn a profit as well is, in my opinion, asking academia to put extra constraints on what it does well while trying to compete with companies that do applied work better, because that's all they do. Asking scientists to apply for grants based on applied research with links to industry means asking them to understand the market for commercial products in addition to their science, an unreasonable request.

Public funding of basic research matters even more in other fields, such as those funded by the NSF in the US and by the other research councils in the UK. But here there's an added problem, one that my colleague Dr. Timmer touched on yesterday: if the public can't see that they are getting their money's worth, or don't understand why it's important to study certain topics, the funding is quick to dry up.

Here at Nobel Intent we like to think that we're doing our own tiny part to improve the dissemination of exciting and interesting research, but for the academy as a whole, there's a long way to go yet.

Apple Store seems to hint at Macs with quad-core CPUs

Update: as noted in the comments, looks like I was right. It now says (italics are mine):

Every new Mac features powerful dual-core or quad-core Intel processing, the world's most advanced operating system, and more.

End update.

Ever-diligent AppleInsider noticed that the Apple Store page for Adobe's Creative Suite 3 says:

Every new Mac features powerful dual-core or quad-core Intel processors, the world's most advanced operating system, and more. Build your Mac to your exact specifications, or start with our recommended configurations that are optimized for Creative Suite 3.

Could this mean that Apple will be releasing Macs with quad-core CPUs soon? That's one explanation. Another would be that the distinction between a quad-core Mac (with two dual-core CPUs) and a Mac with one or two quad-core CPUs is lost on the Apple Store copywriter. Or Apple may want to sell a Mac with a single four-core CPU rather than two dual-core chips like the current Mac Pro.

Many people seem to think the move to octo-core is a no-brainer. I'm sure there will be a Mac with eight cores in it at some point in the future, but adding cores is far from a panacea that makes performance effortlessly go through the roof, for two reasons. The first is bandwidth: in order to keep the additional cores filled with instructions and data, bus and memory speeds should ideally go up along with the number of cores. Better caching and software tricks can help here, but at some point a task or set of tasks becomes bus-bound rather than CPU-bound, and extra cores won't add more performance.
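
To make that bus-bound ceiling concrete, here is a back-of-the-envelope sketch in Python. The bandwidth figures are invented for illustration and don't describe any actual Mac; the point is only that a shared bus caps how many cores can be kept busy, no matter how many are installed.

```python
# Hypothetical numbers: a shared bus ceiling and the bandwidth one busy
# core demands. Past the crossover point, extra cores just wait on the bus.
BUS_BANDWIDTH_GBS = 10.0   # assumed total bus bandwidth, GB/s
PER_CORE_DEMAND_GBS = 3.0  # assumed demand of one fully loaded core, GB/s

def effective_cores(n_cores: int) -> float:
    """How many cores' worth of work the bus can actually feed."""
    return min(n_cores, BUS_BANDWIDTH_GBS / PER_CORE_DEMAND_GBS)

for n in (1, 2, 4, 8):
    print(f"{n} cores installed -> {effective_cores(n):.2f} cores fed")
# With these numbers, throughput flattens out at ~3.33 cores' worth.
```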

The other issue is the software. There are two ways to gain performance from extra cores: by running more stuff simultaneously, or by splitting one task up into sub-tasks that can run on different CPU cores in parallel. The former is what happens on servers, where each core can tend to a different request. On desktops, the move from one to two cores made multitasking a more pleasant experience, but few people run so many applications at the same time that four or even eight cores increase performance for software that isn't multi-core aware. Fortunately, a lot of (Apple) software is multi-core aware these days, so encoding video, rendering image effects, and the like go faster as the number of cores goes up. However, application writers are going to find it harder and harder to split the work their applications have to do into smaller and smaller pieces, and the overhead of managing the parallel operation will only increase.
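
Here's a minimal sketch of that second approach in Python, assuming a made-up "encoding" job as the stand-in workload. The fan-out and merge steps are pure coordination overhead, and they loom larger as the pieces get smaller:

```python
# Splitting one task into sub-tasks that run on different cores.
# encode_chunk is a hypothetical stand-in for real CPU-heavy work.
from multiprocessing import Pool

def encode_chunk(frames):
    """Stand-in for a CPU-heavy job, e.g. encoding a slice of video."""
    return sum(f * f for f in frames)  # dummy computation

def encode_parallel(frames, n_cores):
    # Fan out: cutting the work into pieces is overhead the serial
    # version never pays.
    chunk = max(1, len(frames) // n_cores)
    pieces = [frames[i:i + chunk] for i in range(0, len(frames), chunk)]
    with Pool(n_cores) as pool:
        results = pool.map(encode_chunk, pieces)
    # Merge: combining partial results is more coordination overhead.
    return sum(results)

if __name__ == "__main__":
    frames = list(range(1_000_000))
    print(encode_parallel(frames, n_cores=4))
```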

Bottom line: in a few years, a "core myth" could be filling the shoes of the only recently abandoned gigahertz myth—for now, I'll be salivating over the prospect of imminent octos like everyone else, though.

Vonage hangs up on Verizon patent infringement with new agreement

Vonage has signed an agreement with a VoIP network services provider to carry calls placed by Vonage customers, giving the troubled VoIP provider an out on two of the three Verizon patents it was found to have infringed. According to a Form 8-K filing with the Securities and Exchange Commission, Vonage and VoIP, Inc. have inked a two-year contract under which VoIP, Inc.—likely under its VOICEONE brand—will provide network services for Vonage customers.

Last month, a federal jury found that Vonage's VoIP services infringed on three patents owned by Verizon after deliberating for less than a day. Two of the patents cover connecting VoIP calls to public switched telephone networks (PSTN); the third covers VoIP calls made using WiFi phones. While the jury found that Vonage did not knowingly infringe on Verizon's patents, it did award the telecom $58 million in damages.

When the federal judge overseeing the case issued an injunction against Vonage a couple of weeks later, concerns about the company's viability increased. The judge will rule in the next couple of weeks on whether to enforce the injunction immediately or allow the case to make its way through what could be a lengthy appeals process.

By signing the agreement with VoIP, Inc., Vonage has provided itself with a measure of protection against the injunction. VoIP, Inc. owns its own network, describing VOICEONE as the "first, seamless nationwide IP network." Perhaps most crucially from Vonage's standpoint, VoIP, Inc. claims to own the intellectual property around its network and services.

After the two-year agreement has run its course, the companies have the option of continuing it on a month-to-month basis.

All of this comes at a price to Vonage. With the threat of a permanent injunction hanging over its head, the company was not in the strongest of bargaining positions. The agreement has to be a bitter pill to swallow for a company that is still experiencing high levels of customer churn and has yet to make a profit. The terms of the deal have not been announced, but whatever the terms may be, the agreement represents an additional, ongoing expense for Vonage.

Despite the financial concerns, it was a necessary move for Vonage. The agreement all but kills the threat of a shutdown of Vonage's network, giving both the company and its customers (me, for one) some peace of mind.

Update

After the story ran, Ars was contacted by a Vonage spokesperson who claimed that the agreement with VoIP, Inc. has "nothing to do with the patent situation." She described the deal as another termination deal similar to those Vonage has signed with other carriers, reiterating that it was unrelated to the Verizon litigation. However, an unnamed source at VoIP, Inc. suggested to TelecomWeb that Vonage would indeed be using its network to carry its calls, while refusing to speculate about the patent dustup.

Joost 0.9 for Mac “biggest release yet”

By now, many of you have been able to score invitations from various friends and strangers to try out Joost, the P2P video service once known as The Venice Project, on your Macs. Well, the Joost team has pushed out another version today, bringing the software to version 0.9 and the "biggest release yet." Considering that it's the second release for the Mac, that's… pretty impressive, I guess?

What can be found in Joost 0.9 that wasn't there in Joost 0.8? The Joost team has now redone the registration and login process for when you first start up the program, as well as moved the invitation system to the program itself—not just the web site anymore. The "invitation" widget is located in the "My Joost" area, if you're just dying to check it out. The Joost team has also added a standby mode to the program, which pauses all video and minimizes the program to your taskbar. This sounds like the perfect emergency strategy for those times when you're watching Joost on your computer at work and the boss comes by.

Other improvements include better handling of poor network performance: in an attempt to make your viewing experience better, Joost "will attempt to retry and restart streams that have become stalled." The team has also done a little spring cleaning on the user interface and added a dialog box that appears when you enter windowed mode—they're well aware of the text-squishing problem, and still appear to be perplexed by it.

On top of all of that, Joost has added the ability for shows to have "overlays," which are extra content that may come with various shows, and a "considerably improved" channel catalog. Some less exciting updates include an age warning, improved rewind and forward seeking, and the addition of an RSS reader (of all things).

All in all, it is indeed quite a hefty update, even if most of the added functionality doesn't change much about how you view the content. The file is a 17MB download from Joost's website, and you must still have an invite in order to participate in the beta. Oh, and please, don't start sending me e-mails asking for invites this time!

EU investigating Apple, Big Four labels over country-specific pricing

In the wake of Apple's landmark deal with EMI to sell DRM-free music on the iTunes Store, the EU has confirmed that Apple and the Big Four record labels are now the subject of an antitrust investigation by the European Commission. Late last week, the EC reportedly sent a confidential statement of objections to the iPod maker and record labels EMI, Sony BMG, Universal, and Warner, charging them with violating EU "territorial sales restrictions."

At issue are the different, country-specific storefronts used by Apple's iTunes Store. Most EU residents can buy DRMed tracks for €0.99 each (€1.29 for DRM-free music). UK residents, however, pay £0.79 per DRMed song (or £0.99 for DRM-free music), a roughly €0.17 difference.
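
That €0.17 figure depends on the exchange rate, of course. Here's a quick sketch assuming a spring-2007 rate of roughly €1.47 to the pound (the rate is my assumption, not a figure from the Commission):

```python
# Price gap between UK and continental iTunes stores at an assumed
# exchange rate of 1.47 euros per pound (rates move daily).
GBP_TO_EUR = 1.47

drm_gap = 0.79 * GBP_TO_EUR - 0.99       # DRMed: GBP 0.79 vs. EUR 0.99
drm_free_gap = 0.99 * GBP_TO_EUR - 1.29  # DRM-free: GBP 0.99 vs. EUR 1.29

print(f"DRMed gap: ~EUR {drm_gap:.2f}")          # ~EUR 0.17
print(f"DRM-free gap: ~EUR {drm_free_gap:.2f}")  # ~EUR 0.17
```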

Complaints over the pricing inequity arose shortly after the European launch of the iTunes Store in the summer of 2004. In December of that year, the UK Office of Fair Trading referred the pricing complaints to the EC, and the current action stems from that referral.

The EC believes that the separate pricing structure goes against EU competition rules, and that Apple should offer music at a single price across all EU countries. Apple says that it agrees with the EU's desire for uniform pricing, but lays the blame for the variable pricing at the feet of the record companies. "Apple has always wanted to operate a single, pan-European iTunes store, accessible by anyone from any member state," a company spokesperson told the Financial Times. "But we were advised by the music labels and publishers that there were certain legal limits to the rights they could grant us."

Apple's business practices in Europe have been closely scrutinized as of late. Consumer groups and a handful of governments across Europe have criticized the tie-in between the iPod and the iTunes Store, although the EU has more recently backed down from some tough talk about the situation. Today's deal with EMI may ameliorate some of those concerns, and the Norwegian Consumer Council has signaled its approval of the move. The EU's press release makes it clear that this new investigation is not about DRM: "The Statement of Objections does not allege that Apple is in a dominant market position and is not about Apple's use of its proprietary Digital Rights Management (DRM) to control usage rights for downloads from the iTunes on-line store."

Given the maze of licensing agreements Apple had to negotiate in rolling out the iTunes Store across Europe—the store launched first in the UK, France, and Germany in June 2004, with nine more EU countries added in October of that year, and Norway, Sweden, Denmark, and Switzerland in May 2005—the "record labels made us do it" defense is not outside the realm of possibility. Apple says that it believes it has not violated EU law and will "work with the EU to resolve this matter," according to the same spokesperson.

Going ballistic with electrons

Given the trend in solid-state electronics, where the feature size is expected to shrink every couple of years (Moore's Law), single-molecule transistors (or switches) are certain to be a part of your CPU in the not-so-distant future. In anticipation of this, researchers have been studying single-molecule transistors and switches for a while. However, there are several issues that stand in the way of implementing molecular switches. Mostly, the problems boil down to understanding how electrons flow through molecules.

To understand this barrier a little better, let's take a quick peek at how electrons flow through a metal (or semiconductor). In these materials, the atoms are arranged in a regular spatial pattern, so electrons whose movements match that spatial pattern travel more efficiently than those that do not. By match, I mean that electrons with a certain speed, traveling in a certain direction, will move without interruption in a process called ballistic transport. Electrons with other velocities will collide* with the electrons remaining around the atoms; they lose energy, change direction, and only drift slowly in the direction given by the applied voltage. Normally we don't observe this difference because metallic conductors have a very poor crystalline structure over long distances, so no electrons are ballistic for very long. Nevertheless, it is easy to see that ballistic transport is much faster and generates much less heat than non-ballistic transport. It is also clear that single-molecule devices will almost certainly need to work with ballistic electrons; otherwise the absorbed energy will eventually destroy them.

To better understand transport through single molecules, researchers in Germany have modified a form of electron microscopy. In two separate experiments, C60 fullerenes and an organic molecule (3,4,9,10-perylene-tetracarboxylic acid dianhydride, for the chemists hidden amongst us) were evaporated onto an atomically flat, two-atom-thick layer of bismuth that was itself on a silicon substrate. Bismuth and silicon are well matched in their physical and electronic properties, which maximizes the number of ballistic electrons—provided the applied voltage is correct and the electrons are going in the right direction. Non-ballistic electrons lose energy and are mostly trapped in the metallic layer, so measuring the current through the silicon amounts to measuring the ballistic transport properties of the molecule-metal-silicon system. Using the tip of a scanning tunneling microscope, the researchers were able to direct electrons to specific positions on the surface molecules—essentially probing where injected electrons can match the electronic properties of the molecule and ballistically transfer through to the silicon.

They found that transport through the fullerenes occurs along the carbon-carbon bonds around the surface of the molecule, rather than tunneling straight through the center. Although modeling had predicted this, it was the first experimental evidence that the model was correct. The organic molecule was something of a surprise. The molecule is flat, consisting of a series of interconnected rings with oxygen end groups at each corner, and these molecules lay themselves out in a herringbone pattern on the surface of the metal. From this, one might expect the current to be pretty even over the whole surface. However, ballistic transport is more efficient at the end groups, where the oxygen atoms bend the molecule down toward the underlying metallic layer.

By themselves, the experimental results aren't that significant a finding. The significance, and the reason the work was published in Science, lies in the development of a technique that allows researchers to understand the detailed electron transport properties through molecules and across contacts between molecules and bulk surfaces. Eventually, this should provide insight into many different properties of single-molecule switches and wires, such as points of failure, self-assembly, and contact properties.

*By collide, I don't mean two electrons actually hitting each other; rather, the field generated by the electrons surrounding the atom slows and changes the direction of the drifting electron.

Google crawls off the web and into TV commercials

Even as BusinessWeek raised questions about the growing power of Googlezon, the fire-breathing search monster lurched into a new market: television. Google announced yesterday that it was launching a closed trial of a new television advertising service in conjunction with Echostar and Astound Cable. Unlike traditional TV advertising, Google will only charge for ads that people actually watch.

The system will utilize set-top box data to track viewership of commercials on a second-by-second basis. "Advances in set-top-box technologies make it possible to report aggregate statistics on how many times an ad was viewed and whether it was watched through to the end," Google said in its announcement. "As part of this trial, we will be working with partners to use aggregate, anonymized set-top-box metrics to deliver timely and accurate viewing reports." The new metrics will give advertisers far more insight into particular ad campaigns than they currently have, and all of the buying decisions and reporting information will be available through the standard Google advertising interface.

The result is that Google will bill by CPM (cost per thousand) impressions of an ad, but these numbers will be actual viewership numbers, not aggregate totals from the entire show. That means that advertisers who air ads back-to-back might pay different rates depending on how many customers flipped the channel as the ads ran. The ads can be targeted by demographic, time of day, or particular channel.
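
As a sketch of how such billing might work, assume a hypothetical CPM rate and per-ad viewership counts (all figures invented for illustration; Google hasn't published its rate structure):

```python
# Billing on actual impressions: two back-to-back ads at the same CPM
# rate cost different amounts if viewers flip away during the second.
def ad_cost(actual_views: int, cpm_rate: float) -> float:
    """Cost = CPM rate per 1,000 ads actually watched."""
    return actual_views / 1000 * cpm_rate

print(ad_cost(actual_views=950_000, cpm_rate=8.00))  # first ad: 7600.0
print(ad_cost(actual_views=610_000, cpm_rate=8.00))  # second ad: 4880.0
```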

Ads will be purchased using an auction system like the one that currently exists for AdWords. Google hopes that the simplicity of the system and the ability to do small, targeted ad buys will open television advertising to small businesses, much as Google's system did with the web. Advertisers involved in the initial trial won't be mom-and-pop stores, though; Advertising Age says that Intel and ETrade will be among the first users.

Google has been hinting at such an announcement for several months, and the initiative will join the company's other attempts to move beyond the web page. Google has experimented with print ads, radio ads, and video game ads, but the television project sounds more ambitious than past efforts. Giving advertisers a second-by-second breakdown of how many people watch their ads will bring an entirely new level of granularity to television advertising, but it's not going to happen without partnerships.

Until Google launches its own GoogleSat or turns into an ISP and begins pumping IPTV into consumer homes through all of that dark fiber it owns, the company will need to partner with existing networks and delivery systems in order to have any ad time to sell. Echostar has been willing to let Google resell some of the time that it controls, but other partners with a long history of controlling their own ad sales may balk. This could be especially true of major television networks like NBC or Viacom (owner of several networks), both of which have had conflict with Google over its subsidiary YouTube. Networks that are already wary of Google's online dominance may be hesitant to let the search giant have a piece of the off-line advertising market.

Study finds stable personalities unaffected by violent games

Those of you who have followed the literature examining potential connections between violent video games and real-world violence know that the evidence for such a connection is pretty tenuous. Studies purporting to show such a connection appear on a regular basis, often alternating with other studies that suggest that the connection is illusory. If it's any consolation, researchers in the field find the contradictory results just as confusing as you do, and some have called for efforts to be focused on understanding the reasons underlying the confusion. A paper that's in press at Psychology, Crime & Law claims to have accomplished just that.

The authors of the study note that the literature contains a combination of studies that show a connection between aggression and violent games, others that showed no such connection, and a few studies showing that gaming reduced aggression. They claim that their study is unique in that it considers the possibility that these represent three distinct responses to gaming, and suggest that prior studies may have produced conflicting results by trying to shoehorn these into a binary classification.

They designed a study in which measures of anger levels acted as a proxy for violent behavior. They recruited 135 children but were forced to kick some out of the study due to bad behavior, leaving about 110 boys and 15 girls with a mean age of 14.6 years, all of them familiar with the game of choice, Quake II. The children were given personality profile tests and measured for anger levels, at which point they were set loose for 20 minutes of gaming. Anger levels were measured again following the gaming session.

Crunching the numbers indicated that there were three clear groups. The anger levels of 77 of the subjects remained unchanged after the gaming session. In 22 of the subjects, anger levels nearly doubled from a starting point similar to that of the unaffected children. But 8 of the test subjects started out at this high anger level; for them, 20 minutes of gaming dropped them down to levels similar to those seen in the unaffected group.

The research team then correlated these groups with the personality profiles, and a clear pattern emerged. Those with personalities that were scored as stable largely wound up in the unaffected group, while the remaining two groups were populated by personalities that were considered less stable.

The authors propose that gamers fall into two groups: stable personalities, and those with emotional states that are susceptible to being influenced by game play. Within the latter group, the response to violent games largely depends on the emotional states of the gamers when they begin play. Angry gamers will cool off, calm gamers will get agitated. They also note that only two of the cases of rising anger reached levels that would be considered cause for concern, suggesting that dangerous levels of anger were rarely triggered by gaming.

The authors made it clear that their study should not be viewed as the final word on the matter. The link between anger and aggression is far from clear, and they would like to see similar results reproduced with other test groups and using different games and experimental setups. It's also worth noting that they attempted to measure a wide range of additional factors during their study, but many of these measurements produced statistically insignificant or contradictory results. Nevertheless, the study appears to be significant in that it is the first I've seen that attempts to move beyond adding to the large body of confusing results that already exists, and instead tries to identify the reason that it's so easy to produce contradictory findings in the first place.

Guitar Hero 2 for the 360 is here; a few thoughts inside

"So you play the game with a plastic guitar?" HangZhou Night Net

"Yes."

"Is this for your kids or something?"

"No."

I'm trying to keep my cool, but dealing with the register monkey is driving me crazy. Just give me Guitar Hero 2! The gentleman finally decides to shut up and sell it to me, then looks annoyed when I want a second guitar. But nothing can dampen my enthusiasm as I bound home with my $150 in purchases. Guitar Hero on the 360 is not an inexpensive game to get into, that's for sure. It may also take some running around: in my area, I had to visit three stores before I found both the game and an extra guitar. Your mileage, as always, may vary.

I'm going to save the in-depth thoughts for the big review, but a few things stood out on my first few songs. First, I'm not doing very well, but every time I get a new guitar it takes me a night or two to break it in and get comfortable with it. The buttons seem closer together on the 360 guitar controller than they do on the PS2 original, but overall it feels solid. While the cable looks long, I found that it's still a little short for my taste: I leaned back into a solo and almost toppled my 360. A USB extension cable may be in order here.

We also have another game that supports 1080p on the 360, but Guitar Hero was never meant to be a graphical powerhouse. Still, it's nice to know the option is there if your display supports it. We know from the game directions that the jack on the guitar will be used for upcoming effects pedals, which I look forward to checking out. I'm not so sure about the mobile version of the game confirmed by the press release sent out two days ago, though. I don't see playing Guitar Hero on a phone and having a good time. I'm open to being surprised.

I haven't gotten far, but the master track of "Possum Kingdom" sounds great and is an excellent choice as a new song. I'm looking forward to the leaderboards to see how I stack up against other players, which may be a good incentive to up your score again and again, and should be a good way to build longevity into the title.

Who else picked it up?