Press "Enter" to skip to content

杭州夜生活,杭州楼凤,杭州夜网论坛 Posts


An office app usability rant leading up to an iWork tip

A few days ago, I was listening to episode 36 of the MacBreak Weekly podcast. After a very good discussion of the usefulness of 256kbps audio encoding, the panel sank its teeth into the pros and cons of Microsoft Office's new ribbon user interface. The idea behind the ribbon is that icons on the screen represent various actions, and as you click on the ribbon, more related icons/actions are revealed. So at any one time, only a subset of the possible actions is visible. This is supposed to be easier to work with than the standard menus and toolbars that we're used to. I'm highly skeptical, because I've always disliked the way that toolbars take up half the screen in Microsoft Office, especially under Windows, and I have no idea what most of the icons are supposed to mean anyway.

But I'm reserving final judgment until I get the chance to work with the ribbon for a bit. I can't blame Microsoft for trying something new, though. When writing some text in the new NeoOffice, I was plagued by the little squiggly red lines under many words. Usually, I turn spellcheck-as-you-type off; after all, I am a Published Author and no computer—not even a Mac—is going to tell me how I can and can't spell words in the English language. That's what editors are for.

However, in this case, NeoOffice didn't just flag names and unusual words as spelled incorrectly, but also a lot of very common words. It's entirely possible that I don't spell as well as I think I do, but I'm pretty sure I know how to spell "and." Could it be that NeoOffice was using a language other than English to spell-check my document? But where on Earth do I get to set the language for my text? It took me several minutes to find out that it's under the "character" menu. The logic behind this is probably that you may have a word or a sentence in a different language than the rest of a paragraph or the rest of the document, so it can't be a document or paragraph setting. But I'm pretty sure many people aren't going to look under the character menu when their spell checking is out of whack. Interestingly, in the old version of Word for Mac that resides on my system, this setting is easily found under the tools/language menu.

Apple's take, on the other hand, is slightly different: they use an inspector. For those of you unfamiliar with inspectors: they're little windows containing various settings that you can bring up and close as required. Inspectors are used in iWork, along with separate inspector-like windows for fonts and colors. After getting used to this system, I always found it to work well. The different inspector modes are accessible using icons—but only a few of them, so it's humanly possible to remember their function—and there's no artificial difference between paragraph and character settings: everything is simply found under the big T for text.

There's just one little thing that always bugged me about the inspector: often, it's necessary to switch between different inspector modes frequently, which can get annoying. Turns out that as of Keynote version 1, which I've had since 2003, you can bring up multiple inspectors (under the "view" menu), so you can have immediate access to two or more different modes without the need to switch.


Diplomats force IPCC to water down report on climate change

More climate-change politics this morning, I'm afraid. As you might be aware, the Intergovernmental Panel on Climate Change (IPCC), the multinational UN organization that is tasked with the problem of climate change, is in the process of releasing its fourth assessment report on the "global present state of knowledge on climate change."

The IPCC has three working groups that deal with "The Physical Science Basis," "Impacts, Adaptation and Vulnerability," and "Mitigation of Climate Change." These three groups can be summed up as "How is it happening," "What happens when it happens," and "How the hell do we stop it or deal with it?" Working Group I has already released its outline, and there are draft copies of its part of the report that have leaked onto the internet. Working Group III meets at the end of this month in Bangkok, and Working Group II, which has been meeting in Brussels, released its summary this morning. This summary is intended to distill the contents of the 1,500-page scientific report down to the point where it can be easily read and understood by policymakers.

But the release of that summary has not been without incident. Although the scientists behind the document were happy with their effort, they encountered fierce diplomatic pressure from a number of countries to tone down their language. The problem arises from the use of common language to describe scientific certainty. One scientist speaking to another might describe certainty of outcome as a percentage; a 90 percent certainty, for example, or a 99 percent certainty. As the IPCC summaries are meant for politicians, very few of whom appear to have anything more than rudimentary scientific knowledge, these percentages are translated into plainer English.

The heart of the problem has been the successful efforts by delegates from China and Saudi Arabia to change language describing how many natural ecosystems around the world are already being affected. Originally, it was reported that there was "a very high confidence" that areas around the globe "are being affected by regional climate changes, particularly temperature increases." "A very high confidence" translates as a 90 percent certainty, but under political pressure, this was downgraded to "a high certainty," meaning only 80 percent. Other parts of the report were also watered down, causing outrage amongst the scientists who authored the report.

It would be naive of me to expect that such a thing would not or could not happen, but I can't get away from the feeling that this is more than a little shortsighted on the parts of those nations that are downplaying the problems we face. Editing a word or a graph out of the report is not going to stop the Himalayan glaciers from melting, leaving China with a freshwater shortage. It's not going to stop the northward spread of tropical diseases into Europe, and it's not stopping the Gulf region of the US from being battered by tropical storms of increasing intensity. You can tell yourself that your shoes aren't on fire all you like, but when the flames start licking at your navel, will the denial really have mattered?


Bill may require call center employees to disclose location

"Hello, my name is John, and I'm speaking to you from Bangalore, India, today. How can I help you?" That's a phrase that we may start hearing when we make calls to customer service centers, if a recently-proposed bill by House Representative Jason Altmire (D-PA) goes into effect. The bill, HR 1776, is titled "Call Center Consumers Right to Know Act" and would require call center employees to state their physical location when a customer calls in. HangZhou Night Net

As the title indicates, the bill is designed to make customers aware of the widespread nature of call-center outsourcing. Upon discovering that a large majority of their calls are being redirected overseas, customers would theoretically be more willing to take action and let companies know how they feel about call center outsourcing, a hot-button issue that is often blamed for a portion of lost jobs in the US.

The bill's introduction undoubtedly comes from good intentions, but it seems like a roundabout way of addressing an issue that is clearly important to certain members of Congress. Many Americans are already painfully aware that their calls are being directed overseas, and such a requirement would only confirm this knowledge. However, the bill might encourage customer pressure on companies to change their outsourcing ways or risk losing business. But it's not clear how many Americans would actually use this knowledge to alter their own buying habits—how many consumers say they'd like to support American clothing businesses but completely ignore "Made in [Country]" tags when it comes down to saving money?

Similar bills were introduced by Senator John Kerry (D-MA) in 2003 and again in 2004 by Representative Ted Strickland (D-OH), but both stagnated in Congress. Altmire's bill is still in the very early stages of the process and has not yet been scheduled for debate in the House.

Google adds personalized mapmaking to Google Maps

In an effort to add more social sharing to Google Maps, Google announced today the availability of My Maps, an extension of its web-based mapping tool. My Maps allows any user to create a personalized map—tied to his or her Gmail login—that can contain a variety of information, such as a mapped-out path of a walk around a city or a road trip across the country, a photo montage of a trip, text describing what happened at particular locations, or even embedded video of various landmarks. My Maps is accessible by going to Google Maps and then clicking on the "My Maps" tab next to "Search Results" in the left column.

Creating a personalized map and adding things to it are simple—just drag and drop lines, shapes, or placemarkers onto the map. A few maps have already been created to demonstrate what kinds of things can be done: there's a map of Olympic host cities with links to Wikipedia for more information, a map of the Googleplex complete with pictures for each landmark, and a map of someone's trip across Japan. Even I got into the fun of creating a map of random pictures I had taken around the city of Chicago (along with a path or two for walks I'd taken). More pictures will likely be added throughout the day—it's addicting.

Clicking the blue marker will show a picture I placed on My Map of the restaurant

Each map gets a unique URL for sharing with family and friends, but users can also choose to have their maps published publicly for inclusion alongside regular Google Maps results. This means that whenever someone searches for Fogo de Chao in Chicago, they'll see the red Google Map result for the restaurant as well as the blue marker for the picture that I planted there on my own map.

The functionality is somewhat similar to Flickr's mapping tool, which allows users to drag and drop photos from their Flickr feeds directly onto a map. But unlike Flickr, Google's My Maps allows for the inclusion of other types of media as well as path-mapping and other functionality. Adding pictures is not as simple as Flickr's solution, however, and an average layman may have to learn some very basic HTML in order to include photo/media links in his My Maps landmarks. A solution for people who are not quite as computer-savvy as some of us would be to offer a box that explicitly asks for the URL of a photo in addition to the HTML box. Regardless, My Maps looks like a fun tool for telling media-rich travel stories to friends, family, and even strangers.
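For readers curious about what that "very basic HTML" amounts to, here's a minimal sketch, in Python only for convenience, of the sort of snippet a user would paste into a placemark's description box. The photo URL and caption are invented for illustration; this is not part of Google's product.

```python
# A minimal sketch: the kind of basic HTML a My Maps user pastes into a
# placemark's description box to embed a photo. The URL is a placeholder.

def photo_placemark_html(photo_url: str, caption: str) -> str:
    """Return a simple HTML snippet that shows a photo with a caption."""
    return f'<img src="{photo_url}" width="200" /><br />{caption}'

print(photo_placemark_html(
    "http://example.com/photos/fogo-de-chao.jpg",
    "Dinner at Fogo de Chao, Chicago",
))
```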

Yahoo prepares to take on Google with new search interface

Yahoo! is currently beta testing a new search engine interface that it calls Alpha, which promises to bring advanced customization features to Internet searching. Version 1.0 of the product is scheduled for a public release soon, which will remove the amusing "Alpha (beta)" moniker from the site's main page.

Like other search aggregators before it—anyone remember Copernic?—Alpha takes the results of many different search engines and places them all on a single page. Unlike Google's Custom Web Search, Yahoo! Alpha allows the user to fully customize the page, adding and changing the search panels to support any number of search engines, including competitors to Yahoo Search. The results are loaded using AJAX, so the entire page does not need to refresh when the results for a new search site are loaded. By default, results from Yahoo's own search engine appear in the main panel on the left, with other engines in dropdown boxes to the right.

The Yahoo Search main page.

Customizing the main page requires a Yahoo account, which is free. Once signed in, the user can rearrange the main page using a simple drag-and-drop interface. New search engines can be added using the "Custom info source" page, and not just regular search engines: any website that publishes its search results as an RSS feed can be added to the layout. In a nod to the "Web 2.0" motif of collaboration, users can select a "share this layout with users" box that will allow other Yahoo members to view and copy their search configuration.
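To give a feel for what such a custom source amounts to, here's a rough Python sketch of the consuming side: fetch a site's RSS search feed and pull out the result titles and links, which is essentially what each Alpha panel displays. The feed URL is a placeholder, not a real Yahoo endpoint.

```python
# A rough sketch of consuming an RSS feed of search results, the format
# Alpha's "Custom info source" feature expects. The URL is a placeholder.
import urllib.request
import xml.etree.ElementTree as ET

def fetch_rss_results(feed_url: str, limit: int = 5):
    """Return (title, link) pairs for the first few items in an RSS feed."""
    with urllib.request.urlopen(feed_url) as response:
        tree = ET.parse(response)
    items = tree.findall(".//item")[:limit]
    return [(item.findtext("title"), item.findtext("link")) for item in items]

for title, link in fetch_rss_results("http://example.com/search?q=yahoo+alpha&output=rss"):
    print(title, "->", link)
```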

It's clear from efforts like Alpha and the customized search box for web site owners that Yahoo is serious about competing with Google and believes the best way to do this is to keep coming up with new and different ideas to get people to use their search engine. The company will have to keep trying, as Yahoo seems unable to make any headway in getting out of its second-place ranking behind the industry leader, Google.

Blogosphere growth slowing considerably

The "blogosphere" continues to grow considerably, according to Technorati's quarterly "State of the Live Web" (previously known as "State of the Blogosphere") report, but the growth shows signs leveling off. As of the end of March 2007, the number of blogs tracked through Technorati topped 72 million. The number is up from 35 million tracked blogs a year ago and 8 million in 2005. Technorati says that its data shows that 120,000 new blogs are being created every day, worldwide. HangZhou Night Net

The rate at which the blogosphere continues to grow, however, has slowed considerably. While the growth rate of new blog creation previously showed a doubling of the numbers every six months or so, Technorati's Dave Sifry says that doubling from 35 million blogs to over 70 million blogs took roughly 320 days—just under a year. "This shouldn't be surprising, as we're dealing with the law of large numbers—it takes a lot more growth to double from 35 million blogs to 70 million (which took about 320 days) than when it doubled from 5 million to 10 million blogs (which took about 180 days)," he writes.
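The arithmetic is easy to sanity-check: at Technorati's reported creation rate of 120,000 blogs per day, the 35 million new blogs needed for the latest doubling take roughly 290 days, in the neighborhood of the 320 days Sifry reports (a quick sketch, assuming the daily rate held constant, which it only roughly did):

```python
# Back-of-the-envelope check on the doubling time, assuming a constant
# creation rate of 120,000 new blogs per day (a simplification).
new_blogs_per_day = 120_000
blogs_to_add = 70_000_000 - 35_000_000  # one doubling from 35M blogs

days = blogs_to_add / new_blogs_per_day
print(f"~{days:.0f} days per doubling")  # ~292 days, close to the 320 reported
```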

The number of new blogs isn't the only thing that's slowing down. The company reports that it has noticed slowing growth in daily posting volume as well: Technorati was tracking about 1.5 million blog posts per day in March of this year, compared with 1.3 million posts per day a year ago.

The slowing growth numbers could indicate that the world is beginning to reach its maximum capacity of users interested in creating and contributing to blogs. Analyst firm Gartner predicted late last year that 2007 would mark the peak of the blogging trend after levels crossed the 100 million blog mark. The reasoning behind the prediction was that the majority of those who were interested in creating a blog already had one, and everyone else had already moved on. However, that prediction spurred some disagreement among members of the blogosphere, many of whom pointed out that developing countries—especially large ones, like China—have yet to get most of their population online, and therefore there are still millions of potential bloggers waiting in the wings.

Rats use mental schemas to speed learning

How easy is it to make a memory? We've previously discussed how episodic memories form in the hippocampus, and gradually get translated into long-term memory in the cortex. Research that will appear later today in Science suggests that the process of consolidating these memories doesn't have to be gradual. Instead, if we have previous long-term memories that provide a framework for understanding the new information—a mental schema—then we can solidify new memories rapidly.

The researchers worked with rats that were placed in an "arena" containing a number of sand-filled cups. Six types of food were used in the experiments, and each type was consistently hidden in the same cup (bacon was always in cup six, for example); only one type of food was present in the arena at a given time. As they were placed in the arena, the rats were given a small sample of the type of food that was hidden in that test. The researchers then determined whether the small taste was enough to allow the rats to go directly to the cup that contained more of that food.

Over the span of a month, the rats improved their accuracy so that whenever they were given a priming taste of a specific type of food, they zeroed in on the appropriate location at a rate well above random. This learning depended on the normal memory process, as rats with lesions in the hippocampus stayed at levels of accuracy consistent with random searching.

But the rats had apparently learned more than the simple taste/location association: they had learned to understand the process. After a single example using a new food type was provided, the rats were able to go to the correct location at the same rate as their success with the original foods. By understanding the process via a mental schema, the rats were able to form new associations almost instantly.

How does this play out on the biological level? It turns out that once a schema is in place in the cortex, the process of transferring memories that fit the schema from the hippocampus to the cortex occurs rapidly. Lesions to the hippocampus within three hours of the first use of a new taste cue still blocked the process, but lesions made 24 hours later didn't, suggesting that less than a day is required to consolidate schema-based memories.

Distilled down to its most basic principles, the data is not much of a surprise: how an animal learns depends on what it already knows. But the authors suggest this plays out in more complex ways in humans using an example any scientist reading the paper would understand, that of comprehending a seminar. "Our ability to do so," they write, "depends as much on our possession of an appropriate mental schema as on the communicative skill of the speaker in logically conveying his or her message. In the absence of such mental frameworks, we are unable to follow scientific inferences in a talk and have the phenomenological experience of being functionally amnesic for its content a surprisingly short time later."

In other words, if we can't fit new information into the mental schemas we've developed over the years, we might as well not be listening, because we'll forget it all shortly anyway.

Global chip sales up in early 2007; 2006 was banner year for AMD

The Semiconductor Industry Association has crunched the latest chip sales numbers, and the results paint a picture of an industry in transition. On the one hand, February's semiconductor sales totals were down 6.5 percent from the previous month, with most of the downturn due to regular seasonal factors. Other factors were lower average selling prices and a drop in shipments of microprocessors and DSPs.

But even though February sales were down compared to January, they were still up 4.2 percent versus February of last year. In fact, this 4.2 percent increase is where the most interesting part of the semiconductor sales story lies. A closer look at the SIA data shows that year-to-year semi sales in the Americas were down 8.6 percent, while sales in the Asia-Pacific region were up 10.1 percent, blending to the overall 4.2 percent increase. Yet again, we see Asia consuming a larger relative share of the global semiconductor pie, which is precisely why Intel is so keen on building a real fab presence in China.
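Note that the 4.2 percent figure is a share-weighted blend of the regional moves, not a simple average. A small sketch shows the mechanics; the regional weights below are invented purely for illustration (the SIA release has the actual ones):

```python
# Illustrative only: blending regional year-over-year growth into a
# global figure. The market-share weights are assumed, not SIA data.
growth_pct = {"Americas": -8.6, "Asia-Pacific": 10.1, "Rest of world": 0.0}
share      = {"Americas": 0.17, "Asia-Pacific": 0.56, "Rest of world": 0.27}

blended = sum(growth_pct[region] * share[region] for region in growth_pct)
print(f"{blended:.1f}%")  # ~4.2% with these assumed weights
```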

In addition to the SIA's semiconductor sales numbers, Gartner has released a fresh round of rankings for 2006. First up is Gartner's semiconductor manufacturing equipment vendor rankings—this is basically a list of the companies that supply the equipment that enables semiconductor makers like Intel to design, test, and manufacture chips. I won't bore you with the details of the list, since most readers won't have heard of most of the companies on it, but it's worth noting that overall sales of chipmaking equipment were up 22.6 percent in 2006. This bump in chipmaking equipment means more fab capacity for 2007 and 2008, and more capacity should in turn mean more sales if demand holds up. Some of this increase in chipmaking equipment sales was driven by memory makers, who expect the market's appetite for solid state storage to continue to grow.

The other big Gartner list that came out today was the 2006 rankings for semiconductor sales. As always, Intel topped the list of semiconductor makers, with a market share of 11.6 percent. Memory firms made up much of the rest of the Top 10, with processor makers AMD and Freescale filling the bottom two slots at 2.8 and 2.3 percent market share, respectively.

The biggest story in these sales ranking numbers for 2006 was the rise of AMD. AMD posted 86.1 percent revenue growth versus 2005, an increase that vaulted the company into the Top 10. The Gartner numbers agree closely with the iSuppli numbers that we reported earlier, which also had Intel's revenues down 11 percent in 2006 (Gartner shows a 12 percent decline). Intel is on track to turn things around in 2007, so the big question for AMD is how much of its 2006 momentum it can sustain. This is why the company absolutely has to deliver the goods with Barcelona.

Microsoft sings a different tune: Zune likely to sell DRM-free tracks

Microsoft, until now a staunch defender of DRM, may make an about-face when it comes to its Zune online music store. The software giant will likely offer DRM-free tracks for the Zune in the near future, though no one knows when. With EMI releasing unprotected songs on iTunes, there is little question that Microsoft will be keeping a close eye on how well the DRM-free (and higher-fidelity) songs sell for the market leader.

The fact that DRM-free music is now available from one major label opens a window of opportunity for Microsoft, since downloads from the Zune store would then work on the iPod and vice versa, preventing customers from being locked into a single store for their music downloads. One issue Microsoft will have to work out that Apple did not face: the Zune's ability to "squirt" music to other Zunes. Squirting carries stringent restrictions—the shared song can only be played three times by the recipient.

According to the latest numbers, the Zune is having a hard time gaining ground against other digital music players. Microsoft hopes this will change as it starts a massive advertising campaign in an attempt to push the Zune into the limelight. A flash-based player may help, too, as all Zunes are currently hard-drive based.

There's another hint about the Zune's future, from Microsoft's Jason Reindorp:

"People are responding so well to the colors," Reindorp said. "We're having a lot of fun playing and experimenting with them."

It is disappointing when a world leader in technology uses the colors of its products as the selling point instead of innovation. How about beefing up the WiFi capabilities and then giving us a pink Zune? If this is the best Microsoft has to offer, it's no surprise that it is having a rough time catching up to Apple. Redmond has a history of stealing market share from established players, but the iPod is a formidable competitor.

Azureus gives Zudeo HD video service a new moniker: Vuze

Don't call it "Zudeo"—the real, final, actual name of the new Azureus hi-def BitTorrent service is "Vuze." This is apparently a hipster spelling of "views," which is what Vuze hopes users will do with the new service. The company is counting on voracious consumer demand for high-definition content, and it doesn't believe that current streaming technology is good enough to deliver reliable HD video in real time.

Vuze requires the use of a modified Azureus application that downloads the clips in question using BitTorrent but also processes payments for protected content and searches for new content. Since launching the service as a beta late last year, Azureus has managed to reel in more than 2 million unique users a month, according to the company.

Those users come to find HD content from major networks and indie producers alike. While Vuze touts its openness to independent filmmakers, the service has also built up a nice network of established content partners. The BBC, Showtime, A&E, the History Channel, and National Geographic all have deals in place with Vuze to offer their content, but much of this is currently in standard definition only. For a site touting its HD video, it's unfortunate that so many of the big-name episodes are only DVD quality.

Depending on the content, clips can be purchased or rented. BBC TV shows appear to run $0.99 an episode, but prices vary by producer. Showtime, for instance, gives away single episodes of its dramas in an effort to attract viewers.

Despite the name change and the addition of new navigation and search tools, the service remains in beta.

Microsoft releases API and development docs for Windows Home Server

In an effort to prepare developers for the upcoming Windows Home Server, Microsoft released the first beta of the Windows Home Server software development kit (SDK) documentation. Though Windows Home Server is still in beta (Beta 2), Microsoft is looking to introduce independent software vendors to the platform as soon as possible.

The documentation released today includes guidance for creating Windows Home Server applications, tutorials, sample code, and an overview of the application programming interface (API). The three API files that developers will use are HomeServer.dll, HomeServer.idl, and HomeServerExt.dll. Objects that the API will offer access to include client computers, shared folders, application folders, managed volumes, hard disks, backup jobs, and notifications—all on the server. HomeServerExt.dll will give developers access to the Windows Home Server console so that it can be extended with new tabs and functionality.
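To make the console-extension idea concrete, here is a deliberately generic sketch of the register-a-tab pattern the documentation describes, written in Python purely for illustration. None of these class or method names come from Microsoft's SDK; real extensions are .NET assemblies built against HomeServerExt.dll.

```python
# A generic sketch of the tab-extension pattern (not Microsoft's API):
# the console defines a tab contract, extensions implement it, and the
# console renders whatever tabs have been registered.

class ConsoleTab:
    """The contract a hypothetical console asks extensions to fulfill."""
    title = "Untitled"

    def render(self) -> str:
        raise NotImplementedError

class BackupStatusTab(ConsoleTab):
    """An example extension; a real one would query the server's backup jobs."""
    title = "Backup Status"

    def render(self) -> str:
        return "Last backup completed successfully"

registered_tabs = [BackupStatusTab()]  # discovery would normally be automatic
for tab in registered_tabs:
    print(f"[{tab.title}] {tab.render()}")
```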

In an e-mail to Ars Technica, a company spokesperson said that Microsoft expects ISVs and hobbyists to create tools for system protection, media sharing, home security, and home automation. "The SDK will enable hobbyist and professional developers to build applications that run on Windows Home Server and extend the functionality of the Windows Home Server Console on home computers, or to build applications that run on a home computer and connect to Windows Home Server," the spokesperson wrote.

As with every other Windows operating system, malware and virus protection are certain to be the focus of many software vendors. We can assume that OneCare will be supported, and companies such as Trend Micro, McAfee, and Symantec are almost certainly going to release security tools geared toward the Home Server administrator.

With Microsoft planning for Windows Home Server to go gold in the first half of 2007, it will be interesting to see what kinds of tools ISVs release for the product in the coming months. The company has said that a major goal with Windows Home Server was to create a fantastic platform for developers, but there's no guarantee that it will catch on.

Given the positive buzz surrounding Windows Home Server, we foresee a strong "hobbyist developer" following, but that depends in no small part on the quality and functionality exposed by Microsoft.

Google Desktop comes with junk in the trunk

Being a huge fan of Quicksilver (the Swiss army knife of Mac utilities), I was pretty excited to check out the new Mac version of Google Desktop. As soon as it was released yesterday, I grabbed the download, and gleefully fired up the… installer downloader?

And that was about the end of the excitement. Instead of the usual disk image with an application, Google actually gives you the Google Updater, which acts as an installer for all of Google's applications, and downloads the latest version on demand. I'm not a huge fan of extra, useless applications (I don't mind them when they're useful), so the tedious install process and choice to use Google Updater were strike one for me. Further cementing this is the fact that Google Updater Helper is always running in the background, unless you kill it.

After I dealt with the installation and finally got Google Updater to close (and killed the process), I was still pretty pumped, so off to Google Desktop I went. Upon launching it for the first time, Google Desktop starts indexing. And indexing. And indexing. There are options to change which folders it looks at, but I'm not sure if cutting down the number of folders will stop it from indexing everything at first launch. On my 1.5 GHz PowerBook, indexing my 47GB of files took two and a half hours, which isn't terrible. More annoying, though, is the fact that the indexing process cannot be easily stopped. There's no way to shut it down from the preference pane, and killing the process simply causes it to re-spawn. I'm going to count this as strike two.

Once I finished with the lovely task of trying to kill the indexer, I began to wonder just what other tricks Google Desktop had up its sleeve. Apparently John Gruber had the same question, as he's taken a look at just what files Google Desktop puts where. If you're bothered by applications throwing things around willy-nilly, stay away from Google Desktop. In addition to the application, it comes with an InputManager (InputManagers tend to be nasty hacks, and they're deprecated in Leopard) that gets installed for all users, and it also adds some daemons and a preference pane. What really gets John, though, is the fact that Google Desktop installs to the sacred ground that is /System/Library, which shouldn't be touched by third-party developers. After reading his piece, I chalked up strike three for Google Desktop, and decided to dump it.
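If you'd like to see for yourself what an installer has scattered around, a few lines of Python will list the locations mentioned above. The paths are the standard macOS ones; the script only reads, so it's safe to run.

```python
# List the contents of the install locations discussed above (macOS).
import os

suspect_dirs = [
    "/Library/InputManagers",
    "/System/Library",  # third parties shouldn't be touching /System
    os.path.expanduser("~/Library/PreferencePanes"),
]

for directory in suspect_dirs:
    print(f"{directory}:")
    try:
        for entry in sorted(os.listdir(directory)):
            print(f"  {entry}")
    except (FileNotFoundError, PermissionError) as err:
        print(f"  ({err.__class__.__name__})")
```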

Fortunately, the uninstall functionality works quite well and gets rid of all of Google Desktop's tendrils. Now, I know I'm a bit picky when it comes to my software, and I'm certainly hopeful for what Google Desktop may become in the future, but I don't see any reason to use it at this point in time. It may well turn into a great utility, but for now it's bloated with extra stuff, lacks features, and generally doesn't behave the way a Mac app can (and should, especially with regard to install locations). As such, I can't see myself re-installing it any time soon.

Guitar Hero 2

I said, ARE YOU READY TO ROCK?


Guitar Hero 2
Developer: Harmonix
Publisher: Activision
Platform: Xbox 360
Price: $90
Rating: Teen

I have many memories of Guitar Hero. The first time I played was at a trade show, and when I held that first guitar controller and clumsily tried to play a song it was like standing in the path of a huge wave. I knew that the game was going to be a hit. When it was released and none of the game stores knew what I was talking about when I called asking for it, I also knew it would be a slow burn. It was a game that a few people would buy, show their friends, and then their friends would buy it. I sold around 10 copies of it myself after friends got hooked at my house and then ran to the store to get their own setup to practice for our epic Friday night multiplayer sessions.

When Guitar Hero 2 came out we all got together again to hook a PlayStation 2 up to a projector and a sound system to play the game in style. There were about 30 people there, and I remember it felt like a concert: when someone nailed a tough-looking solo the people watching would cheer and pump their hands in the air. People would sing along while they waited for their turn, and would boo particularly feeble-fingered players. Everyone had a huge grin on their face as they handed the controller off to the next person. At some point one of us dropped to his knees and played the guitar behind his head—the crowd went crazy.

People complain and moan about how Guitar Hero isn't as rewarding as playing the real guitar or that it's a waste of time. All video games, if you look at them through that filter, are a waste of time. Why game when you could be learning real-world skills? But many people I know have families, full-time jobs, and huge CD collections. Being able to taste the thrill of rocking out without spending the first few months of their training dealing with scales and bloody hands is a great thing. Guitar Hero makes it easier for anyone to feel that rush of playing live music, and while the real guitarists can be as snooty as they want, I'm not going to look down my nose at someone else's fun.

Now that Guitar Hero 2 is out on the 360, its main job is to justify its own cost: $150 for the game and an extra controller is a high price for a game that's been out for a significant amount of time already. Sure, it's in high definition now, but Guitar Hero 2 was never a graphical feast to begin with. If you have a 360 and no PlayStation 2 it makes the price easier to take, but will you have anything worth lording over your friends trapped in the last generation?

I've played all the songs a few times, unlocked all the modes, and almost smashed the guitar after five-starring a Butthole Surfers song. Let's see those lighters; we're going to see if this game adds more rock to the already near-perfect PS2 edition.