
Steven Warshak, the man behind the "natural male enhancement" product Enzyte often advertised on late-night TV, has successfully challenged the government's ability to access his e-mails without obtaining a search warrant or notifying him.

The Sixth Circuit Court of Appeals ruled yesterday that the government had acted improperly in its wire fraud and money laundering case against Warshak and his company. As part of the case (which we reported on earlier), the feds secured a court order under the Stored Communications Act (SCA) that allowed them to access Warshak's stored online e-mail.

A court order does not require the full "probable cause" level of evidence demanded by a search warrant, but it does involve some judicial oversight. Normally, a court order of this kind requires notification so that the subject of the order can challenge it, but in this case, the judge gave the government 90 days to look at the e-mails before it needed to contact Warshak. This is allowed under the SCA, but Warshak argued that gaining access to his e-mail without 1) a warrant or 2) a court order with notification was a violation of the Fourth Amendment.

The Appeals Court ruled in Warshak's favor. In the decision, the Court noted that the rules "still allow seizures of e-mails pursuant to a warrant or with prior notice to a subscriber" but that obtaining such a court order without notification is no longer permitted.

The court also responded positively to the idea that e-mails should be given the same privacy protection as phone calls. This means that getting access to an ISP's customer information database would be allowed without a warrant, but getting access to the actual text of the e-mails would not. In the telecom world, this is analogous to the "pen register" that grabs data about what phone numbers are being dialed but does not provide access to the content of the call.

The Court found that "individuals maintain a reasonable expectation of privacy in e-mails that are stored with, or sent or received through, a commercial ISP," dealing a blow to government attempts to get easier access to e-mails stored with an ISP than those stored on a suspect's own computer. Protecting the privacy of e-mail is "as important to Fourth Amendment principles today as protecting telephone conversations has been in the past."

"E-mail users expect that their Hotmail and Gmail inboxes are just as private as their postal mail and their telephone calls," said EFF staff attorney Kevin Bankston, who helped draft an amicus brief in the case. "The government tried to get around this common-sense conclusion, but the Constitution applies online as well as offline, as the court correctly found. That means that the government can't secretly seize your emails without a warrant."

With that important e-mail issue resolved, the case against Warshak will continue.

Gateway has announced that it is recalling 14,000 notebook batteries from laptops sold during May and June 2003. The recall is a response to lithium-ion batteries that can overheat and potentially cause a fire. The faulty batteries can be found in Gateway notebook models 400VTX and 450ROG and will be replaced for free. Not every model uses these batteries, though, so here's how to find out if yours does.

To find your battery number, you'll need to remove your battery from your laptop. Before doing this, make sure your LCD is closed, and your laptop is face down, back up. Unlock the notebook battery lock and slide open the battery release latch, then slide the battery out of the bay. On the battery you'll find two numbers: a serial number and a battery part number. If your battery has part numbers 6500760 or 6500761, then you have one of Gateway's faulty batteries. To exchange your battery for a new one, fill out Gateway's Battery Exchange Request Form.
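If you're checking more than one machine, the part-number test itself is trivial to script. Here's a minimal Python sketch (the two part numbers are from Gateway's notice above; everything else is just illustration):

```python
# Gateway's recalled battery part numbers, per the recall notice above.
RECALLED_PART_NUMBERS = {"6500760", "6500761"}

def is_recalled(part_number: str) -> bool:
    """Return True if a battery part number appears on Gateway's recall list."""
    return part_number.strip() in RECALLED_PART_NUMBERS

# Example: check a couple of part numbers read off the battery labels.
for pn in ("6500760", "6500999"):
    print(pn, "-> request an exchange" if is_recalled(pn) else "-> not affected")
```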

Last year Sony issued a worldwide recall for Sony-manufactured lithium-ion batteries that shipped in Lenovo/IBM, Dell, Apple, and Toshiba notebook computers after battery malfunctions caused a Lenovo ThinkPad battery to burst into flames in a Los Angeles airport. Earlier this year, Lenovo recalled ThinkPad batteries for over 208,000 notebooks after overheating issues caused damage to a number of notebooks. Speaking of Toshiba, the company yesterday stepped up its own notebook battery recall after a laptop caught fire in Britain last month. Toshiba is currently in talks with Sony to discuss reimbursement for the recall, which is expected to cost Sony roughly $400 million when all is said and done.

A new battery standardization project hopes to make recalls a thing of the past. The Association Connecting Electronics Industries (IPC) Lithium Ion Battery Subcommittee said last year that the IPC expected to have completed a lithium-ion battery standard for laptops and handheld devices by this time (it has yet to arrive). In December the IEEE said that it expected its revised IEEE 1625 standard to be completed by the end of 2007; at that rate, though, we likely won't see the finished product until sometime in 2008.

Without an official standard for lithium-ion batteries, manufacturers like Matsushita have taken matters into their own hands. Last December Matsushita developed a safer lithium-ion battery for notebooks that uses a heat-resistant insulator between the cathode and the anode of a battery that prevents punctures from short-circuiting batteries.

After a few runs, I began to ask whether I was pushing myself hard enough. I could always try to up my personal best, but that isn't always the best indication of whether you are working as hard as you should be. Ideally, I would use a heart rate monitor, but that is significantly more money than I'd already spent. Second best would be a personal trainer to motivate me to work my hardest, but unfortunately that would be even more expensive than the heart rate monitor.

So what am I (and you) to do? Luckily for us, Nike has us covered. On the iTunes Store, the shoe company has a variety of different workouts available to help keep your running steady. Today we will look at Improve our Endurance 1.

There is the saying "Nothing in life is free." Well, these workouts are no exception. Some might even consider them a poor value, but hold any judgment until the end. For $14.99, you get ten full-length songs from the hip-hop genre, including tracks by Obie Trice, Busta, and the Pussycat Dolls. You also get an additional track entitled the "Continuous Mix," which is the full workout track, and a digital booklet. The "Continuous Mix," which changes songs to go along with the speed at which you are supposed to run at any given time, also features a voiceover with training instructions (the continuous mix only works with iPod nanos, by the way). Here, the instructions say to do a ten-minute warmup, four sets of three-minute speed intervals, and then ten minutes of cool-down.

I know what you are asking: "If that's the workout routine, why not just do that? Why not just use music you already have and a stopwatch?" For some, that method might be enough, but for those of us who like the encouragement and time updates that a personal trainer, a coach, or a voiceover track provides, this workout works well. There is something to be said for a voice telling you that you are halfway there or that there are "only three minutes" remaining. The change of tempos and intensity throughout the workout does a lot for your mindset during your run, too. If the 42:49 running time seems like too much or doesn't fit into your schedule, you can always do what I do and tailor it to your ability or needs. For me that means not using the entire 42 minutes but instead using the track for a given distance.

Here is the bad: if you are to the point where you can run intervals more than twice a week, and this is the only interval training track you have, this music will get pretty boring pretty quickly. If you run this interval training once a week, it isn't so bad, but you will begin to feel some hatred for the Pussycat Dolls after a while. Be warned!

AT&T has quietly begun offering DSL service for $10 per month to new customers. The service is offered as part of the concessions the telecom made to the Federal Communications Commission in order to gain approval for its merger with BellSouth, but the speed is nothing to get excited about: 768Kbps down and 128Kbps up.

AT&T is also doing little to publicize the new offering. In fact, I was only able to discover any reference to the low-price service by clicking on the Terms and Conditions link at the bottom of AT&T's residential high-speed Internet product page. A note on AT&T Yahoo! High-Speed Internet buried six paragraphs down says that the "basic speed ($10.00)" tier is available only to new customers who have not subscribed to AT&T or BellSouth DSL during the past 12 months, and that the service requires a one-year contract.

Customers must also order phone service to get the budget-priced DSL service; those looking for cheap, naked DSL should look elsewhere. Those living in BellSouth's former territory can get naked DSL for the next two-and-a-half years, however.

Along with the budget high-speed Internet and naked DSL, AT&T also promised to maintain a "neutral network and neutral routing in its wireline broadband Internet access service" while also giving up its rights to the 2.5GHz spectrum. (WiMAX provider Clearwire recently completed the purchase of AT&T's unused 2.5GHz holdings.) In addition, AT&T must offer broadband to 100 percent of all residential living units in its territory, with 85 percent of that delivered by wire.

As is the case with the naked DSL offering, AT&T is only required to offer the $10 per month tier for the next two-and-a-half years. After that, the company is free to make whatever changes it wants to the service.

It's only $5 cheaper than AT&T's current lowest-priced service, but at $10 per month, the service could appeal to budget-minded consumers—especially those who are paying about that amount for dial-up service. More importantly for AT&T, it gives the company another platform from which to pitch its U-Verse broadband and IPTV service. After two and a half years of 768Kbps service, U-Verse may look very attractive to lower-tier customers.
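To put the 768Kbps tier in perspective against the dial-up connections it is meant to replace, here's a rough back-of-the-envelope sketch (raw line rates only; real-world throughput will be lower once protocol overhead is factored in):

```python
def minutes_to_download(size_mb: float, link_kbps: float) -> float:
    """Minutes to move size_mb megabytes over a link rated at link_kbps kilobits/second."""
    kilobits = size_mb * 8 * 1000  # MB -> megabits -> kilobits (decimal units)
    return kilobits / link_kbps / 60

for label, kbps in (("AT&T basic DSL, 768Kbps", 768), ("56K dial-up", 56)):
    print(f"{label}: {minutes_to_download(5, kbps):.1f} minutes for a 5MB download")
# Roughly 0.9 minutes vs. 11.9 minutes -- slow, but a big step up from dial-up.
```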

24-hour test drive: PC-BSD

A different flavor of BSD

PC-BSD is not a Linux distribution; rather, it could be considered one of the first major FreeBSD-based distributions to live outside of the official FreeBSD project. Like most distributions, it has implemented certain features in a way that attempts to distinguish it from the competition, and I will focus mostly on these differences. This test drive is intended to give an overview of what PC-BSD is and why one would consider using it.

First and foremost, PC-BSD is an attempt to make a user-friendly Unix. Many Linux distributions have a similar focus and attempt to achieve it in different ways, and PC-BSD should be considered alongside these distributions. Additionally, PC-BSD's developers went to great efforts to make users who are transitioning from Windows more comfortable—more on that later.

The version I tested was PC-BSD 1.3, which is based on FreeBSD 6.1, X.org 6.9, and KDE 3.5.5—none of which are the latest release. The use of older releases fits nicely with PC-BSD's focus on releasing an OS that is stable, secure and friendly. There is a testbed release available for those willing to live on the edge (and bleed a little) that includes more recent software… and the problems associated with it. PC-BSD appears to be available only in the 32-bit x86 flavor.

Hardware test bed:
AMD Athlon 64 3200+
MSI RS-480-M2 motherboard
1GB RAM
250GB SATA hard drive
PCIe NVIDIA GeForce 7600

The installation process

The install program is fast and simple, with limited options for installation. Upon first boot, you are dropped into an ncurses menu that lets you launch the graphical installer, drop into an emergency shell, and so forth. The installer can optionally be run in VESA mode if your video card is not properly detected and initialized (as was the case with my PCIe NVIDIA GeForce 7600). The fallback mode can be selected from the installation menu.

Once in the graphical installer, you are given a very easy-to-use installation procedure that happens to be a single program running inside Fluxbox. This is only noticeable to the trained eye, as the only clue that you even have a window manager is a one-pixel line running along the bottom of the screen that turns into a taskbar when your mouse gets too close. The installer allows you to choose a "Desktop/Laptop" installation versus a "Server" installation, and it includes things such as automatically setting up the OpenBSD PF (Packet Filter) firewall, which it refers to as the Personal Firewall. Same letters in the acronym… very clever.

There is no package selection, and as a result, installation is very fast, as it's simply a matter of watching the installer extract some tarballs. No configuration is really performed at the time of installation, except for those questions the installer asks. The total time to install was around 20 minutes.

Installation went smoothly until the reboot for me, due once again to my X driver problem. If I were not a *nix professional, I would have panicked at this point. Since I am, I was able to boot into safe mode, log in as root, remount the filesystem as read-write, and try to edit my xorg.conf file. In safe mode, I found that something was wrong with the line terminations when using vi, so I had to use less to view the files and then construct a sed substitution to change the video driver from "nv" to "vesa." Upon reboot, everything worked swimmingly. I should note that the bootloader PC-BSD installs is the FreeBSD default bootloader, which detected my existing SATA drive and always allowed me to boot into my preexisting operating systems if I ran into trouble.
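For anyone who hits the same wall, the fix is just a one-line substitution in xorg.conf. Since vi was mangling the line endings for me, something like the following Python sketch would do the same job as the sed substitution described above (the file path is the usual X.org location and is an assumption; adjust it for your install):

```python
# Switch the X.org video driver from "nv" to "vesa" in xorg.conf.
# Path assumed to be the standard location; run as root in safe mode.
import re
from pathlib import Path

conf = Path("/etc/X11/xorg.conf")
text = conf.read_text()
Path(str(conf) + ".bak").write_text(text)  # keep a backup before touching anything
conf.write_text(re.sub(r'Driver\s+"nv"', 'Driver "vesa"', text))
```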

I had selected the option during install to automatically log me into my main user account on boot, and it did just as I requested. I must note that KDE seemed to load much faster on PC-BSD than I'm used to; probably around three times faster than my Kubuntu installation on my other drive (which either says something bad about Kubuntu or something great about PC-BSD). In fact, the whole system felt very snappy.

Two law school students filed a lawsuit last week against the administrator of a web site and 28 of the site's users, alleging psychological and economic injury. The two plaintiffs, anonymously listed as Doe I and Doe II, are female students at Yale Law School and claim that the users of a third-party law school message board have consistently and regularly made such disparaging remarks about their characters that it has cost them not only their emotional wellbeing, but internships and jobs. And despite repeated requests to remove the offensive posts, the site's administrators continually refused to do so.

The posts occurred on AutoAdmit, a site that describes itself as the world's "most prestigious" college discussion board and claims to help students with law school information, hiring practices at law firms, and more. The comments against Doe I and II started as far back as 2005, when a poster from Doe I's undergraduate university, Stanford, warned everyone at Yale Law School to "watch out" for her in a thread titled "Stupid Bitch to Attend Yale Law." Thus began the string of public character assassinations, rumors, and (repeated) rape threats. Various users on the site also posted what she claims to be false information about her LSAT score, accused her of participating in a lesbian relationship with a Yale Law School administrator in order to gain admission, and encouraged others to warn law firms about her alleged illegitimacy.

Similarly in 2007, Doe II became the topic of several threads on AutoAdmit, focusing mostly on certain body parts (complete with pictures of her ripped from sites like Facebook) and also with repeated rape threats. Some posters encouraged others to stalk her and take more photographs, while continuing to encourage various lewd acts.

The complaint

In the complaint as seen by Ars Technica, Doe I and II claim to have lost sleep, fallen behind on schoolwork, suffered strained personal relationships with their families, and been forced to attend therapy as a result of the postings on AutoAdmit. Additionally, Doe I claims to have lost job prospects. She says that at some point, she applied for 16 different on-campus interviews at Yale, which resulted in a mere four callbacks and zero offers. "On information and belief, it is unprecedented for a second-year law student from Yale to participate in so many interviews without obtaining a single summer associate offer," the complaint reads. Her academic qualifications were similar to those of other classmates who had received offers, the complaint says.

The suit names the online pseudonyms of 28 anonymous posters on AutoAdmit in hopes of using subpoenas to identify them in real life. The two women are also suing site administrator Anthony Ciolli, whom they say knowingly allowed these posts to stay on the site, and profited from them, despite AutoAdmit's "no outing" policy—a policy that states that posts containing real-life information about other users will be deleted immediately. The women are also concerned that the posts on AutoAdmit are showing up in Google results when users perform queries on their names. The complaint itself mentions that several posters on AutoAdmit have attempted to "googlebomb" the women's names with defamatory comments, and that the first several Google hits for one of the women's names do in fact point to threads from AutoAdmit about her.

Fallout? What fallout?

Targeting Ciolli may prove difficult, however, partly because he did not author the posts himself. Ciolli may also be protected by laws stating that a site's administrators aren't responsible for the posts made by its users, such as the DMCA's Safe Harbor for copyrighted content. In March, Ciolli also told the Washington Post that his co-administrator, Jarret Cohen, was solely responsible for approving or deleting comments and that he had no authority to do so. As an interesting tidbit of side trivia, Ciolli—a law graduate himself—recently had an offer from a Boston law firm rescinded over his involvement with and the content on AutoAdmit, according to the Wall Street Journal.

Discovering the identities of the 28 posters could be difficult as well, since AutoAdmit apparently does not retain IP addresses for its users and does not require them to register with real names, according to the Washington Post—just valid e-mail addresses. However, those e-mail addresses could still eventually give away the identities of the posters involved, as it's probable that the e-mail service providers have more personal information stored about their users than AutoAdmit does and could be forced to give it up through subpoenas.

Ciolli and the AutoAdmit gang may not exactly have precedent on their side either. A student blogger from UC Berkeley recently lost a defamation case brought against him by journalist Lee Kaplan last week. The student, Yaman Salahi, had set up a blog called Lee Kaplan Watch in which Salahi cited articles written by Kaplan and publicly disputed various claims. Kaplan sued Salahi for business interference and libel, which Salahi lost in small claims court not once, but twice. On his blog, Salahi argues that because he was sued in small claims court and not a "real" court, he was unable to take advantage of California's anti-SLAPP—Strategic Lawsuits Against Public Participation—protections. "I have absolutely no doubt that had this lawsuit been filed in a real court, I would have won," Salahi wrote.

Doe I and II are asking for punitive damages in the amount of $245,000 as well as unspecified actual and special damages. The complaint also requests that the threads be permanently removed from AutoAdmit and that the administrators authorize Google to permanently remove cached versions of the threads.

Some experts believe that this case will go a long way towards testing the legal limits of anonymous Internet postings. University of Texas law professor Brian Leiter told Reuters that "the most vile posters on that board are two subpoenas away from being outed," which he says led to "much amusement" by AutoAdmit posters. "But they are about to find out that this is how it works," he added ominously.

A new study says that on average, more than half of the ink from inkjet cartridges is wasted when users toss them in the garbage. Why is that interesting? According to the study, users are tossing the cartridges when their printers are telling them they're out of ink, not when they necessarily are out of ink.

The study by TÜV Rheinland looked at inkjet efficiency across multiple brands, including Epson (which commissioned the study), Lexmark, Canon, HP, Kodak, and Brother. It studied the efficiency of both single- and multi-ink cartridges. Epson's printers were among the highest rated, at more than 80 percent efficiency using single-ink cartridges. Kodak's EasyShare 5300 was panned as the worst printer tested, wasting 64 percent of its ink in tests. TÜV Rheinland measured cartridge weights before and after use, stopping use when printers reported that they were out of ink.

That's the first problem. Printers routinely report that they are low on ink even when they aren't, and in some cases there are still hundreds of pages worth of ink left.

The second issue is a familiar one: multi-ink cartridges can be rendered "empty" when only one color runs low. Multi-ink cartridges store three to five colors in a single cartridge. Printing too many photos from the air show will kill your cartridge faster than you can say "blue skies," as dominant colors (say, "blue") are used faster than the others. Therein lies the reason Epson backed the study: the company is singing the praises of its single-ink cartridge approach, an approach which is necessarily more efficient in terms of wasted ink because there's only one color per cartridge, and thus only one cartridge to replace when that color runs out.

Single-ink cartridges aren't exactly perfect, however. Such cartridges were still reported as empty with an average of 20 percent of their ink left, which means that an entire cartridge's worth of ink is wasted for every five that are used. Given the sky-high prices of ink, this is an alarming finding. Epson's own R360 posted the best numbers, with only 9 percent wasted. Then again, Epson commissioned the tests, so we must ask what's missing.
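The arithmetic behind that claim is simple enough to check; a quick sketch using only the percentages reported above:

```python
def cartridges_wasted(cartridges_used: int, fraction_left: float) -> float:
    """Full cartridges' worth of ink thrown away, given the fraction left in each 'empty' one."""
    return cartridges_used * fraction_left

print(cartridges_wasted(5, 0.20))  # average single-ink cartridge: 1.0 wasted per five used
print(cartridges_wasted(5, 0.09))  # Epson's R360, the best performer: 0.45 per five
```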

The study did not measure how much ink is lost due to lack of use, or through cleaning processes. Inkjet cartridges are known to suffer from quality problems if they are not used for long periods of time, sometimes "drying up." This problem has been addressed in recent years, but it has not been eliminated.

The study also did not calculate the total cost per page, which is arguably more important than efficiency. Even if Epson's single-ink, multiple-cartridge approach is more efficient, it could nonetheless be more expensive per page than multi-ink cartridge systems. In their defense, Epson and TÜV Rheinland said that the study focused on the ecological impact of inkjet printing. This is a familiar argument: hybrid cars have also been criticized over their supposed efficiency, with debates raging as to whether or not the average driver will ever see cost savings from better miles-per-gallon given the relative expense of hybrid engines.
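For what it's worth, cost per page is a trivial calculation once you have cartridge prices and page yields, which makes its absence conspicuous. A sketch with purely hypothetical numbers (none of the prices or yields below come from the study):

```python
def cost_per_page(cartridge_price: float, rated_pages: float, fraction_wasted: float) -> float:
    """Effective cost per page once ink stranded in a 'spent' cartridge is accounted for."""
    usable_pages = rated_pages * (1.0 - fraction_wasted)
    return cartridge_price / usable_pages

# Hypothetical: an efficient single-ink cartridge vs. a cheaper-per-page multi-ink cartridge.
print(f"single-ink: ${cost_per_page(15.0, 300, 0.09):.3f}/page")  # ~$0.055/page
print(f"multi-ink:  ${cost_per_page(25.0, 900, 0.40):.3f}/page")  # ~$0.046/page
# With these made-up figures, the more efficient system still loses on cost per page --
# exactly the gap the study leaves open.
```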

As such, anyone in the market for an inkjet printer still needs to compare specific models to one another to get a feel for efficiency, and Epson's efficiency claims need to be weighed against the comparative cost of competing inkjet solutions.

Still, the unintended result of this study is that regardless of the battle between single- and multi-ink cartridges, inkjet printers themselves are significantly off the mark when it comes to reporting the fullness of their cartridges. As the Eagles would say, you're best off when you "take it, to the limit." (Or with a laser printer, one can always do the toner cartridge cha-cha.)

Further reading:

Kodak inkjets doomed to failure, says Epson
Epson pushes single-ink cartridges

On Monday, Microsoft launched a new version of its MSN Mobile web site. Besides fitting nicely inside a mobile browser, the site offers most, if not all, of the content that comes with MSN. That content includes access to Windows Live products like Windows Live Hotmail, Windows Live Local, Windows Live Messenger, Windows Live Spaces, and Live Search.

One of the major aspects of the new MSN Mobile site is that it has been developed with just about every mobile browser in mind. Specifically, Microsoft claims that the site will render appropriately for all browsers that support Wireless Application Protocol (WAP) 1.2 and higher. Besides optimal rendering, Microsoft has also included quick links to frequently requested content such as e-mail, messaging, and maps.

One of the main user groups that Microsoft is focusing on with this new release is sports fans. Thanks to an exclusive deal with FOXSports.com, fans can use MSN Mobile to access statistics, schedules, scores, and player information. Because MSN Mobile also offers streaming content, there's a good chance that you'll also be able to access plenty of JB, Terry, Howie, and Jimmy's insightful commentary during football season. If that's not a deal-breaker, I don't know what is.

Consumers aside, Microsoft is also positioning the new MSN Mobile to operators. Using Microsoft adCenter and ScreenTonic, the company hopes to generate advertising revenue from both local and national businesses. The acquisition of ScreenTonic will certainly be of use here as its STAMP technology considers factors like screen size, geolocation, and formats when creating targeted advertisements.

Currently, MSN Mobile is only available in the United States, but Microsoft plans to expand the site to a global market throughout the remainder of 2007.

iPhoneDevCamp to whip developers into shape

While there's currently no real SDK for the iPhone (and little chance of one to boot), it's important to remember that developers at least have some ability to write applications for the iPhone. The use of web applications for the iPhone has been debated and criticized any number of times, but some developers have been busy writing sample iPhone applications with some excellent results.

There was a lot of discussion about iPhone development at WWDC this year and even a session about creating iPhone-friendly sites. But there are lots of aspiring developers out there who didn't attend WWDC, as well as those who did attend and are craving more knowledge. For everyone interested in iPhone development, the iPhoneDevCamp is shaping up to be the cool place to go. It's more of an intensive boot camp than a conference, since the primary goal seems to be to get as many people as possible together to talk about, learn about, and write applications for the iPhone. There will be a few presentations in the beginning as well, but most of the time will be dedicated to actual development. And developers, if you don't have a shiny new iPhone by July, don't worry: it's recommended but not required.

The good news is that the DevCamp is free, and will be held from July 6 to 8 in the Bay Area. No venue has been picked yet, but the organizers are trying to nail that down quickly. The organizers are also actively seeking sponsors and a few presenters, so if you can help out in either of those capacities, you might want to let them know.

It seems like a great idea, but everything is a bit tentative for now, and there are only a few weeks left until July 6. There are 27 attendees so far, but I'm really hoping that everything will come together and the iPhoneDevCamp will be a big success and a big step for iPhone development. If you're interested in dropping by, the iPhoneDevCamp wiki has all the details, but you can follow the status on Twitter and a few other sites, as well.

In the Internet traffic race, P2P used to be way out in front. For years, P2P traffic eclipsed HTTP traffic as broadband users slurped down music and movies, some of which were actually legal. But P2P fell behind this year; for the first time in four years, HTTP traffic is out in front.

Ellacoya Networks, makers of deep packet inspection gear for carriers, has pulled together some statistics on one million broadband users in North America, and its findings show that HTTP traffic accounts for 46 percent of all broadband traffic. P2P applications now account for only 37 percent.

Data source: Ellacoya Networks

Chalk it up to YouTube and other Internet video sharing sites. The surge in HTTP traffic is largely a surge in the use of streaming media, mostly video.

Breaking down the HTTP traffic, Ellacoya says that only 45 percent is used to pull down traditional web pages with text and images. The rest is mostly made up of streaming video (36 percent) and streaming audio (five percent). YouTube alone has grown so big that it now accounts for 20 percent of all HTTP traffic, or more than half of all HTTP streaming video.
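The reported figures hang together; here's a quick sketch checking the YouTube claim against the streaming-video share (percentages from Ellacoya as reported above):

```python
# Ellacoya's reported shares of HTTP traffic.
web_pages, streaming_video, streaming_audio = 0.45, 0.36, 0.05
youtube_share_of_http = 0.20

print(f"YouTube as a share of HTTP streaming video: {youtube_share_of_http / streaming_video:.0%}")
# ~56 percent -- i.e., "more than half of all HTTP streaming video."
```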

Looking over all the numbers, one of the most surprising results is the continued success of NNTP (newsgroups) traffic, which still accounts for nine percent of the total. Clearly, newsgroup discussions (and, ahem, binaries) are still big business.

The data may provide some ammunition for companies that favor traffic shaping on their networks. Between P2P, newsgroups, and streaming HTTP video traffic, the vast majority of Internet traffic is non-critical (i.e., no one's going to die or lose $20 million if they don't download a YouTube clip or a new song in under a minute). Networks that want to ensure priority transmission of VoIP calls, traditional HTTP web browsing, medical imaging, etc., have a strong incentive to throttle back that flood of non-critical traffic when the network is experiencing heavy loads. That could bring them into conflict with proponents of strict network neutrality, though, who don't want to see any sort of packet prioritization.

Frank's big Canadian crush on Dr. Cameron aside, the 360 port of Command & Conquer 3 was a solid game. A solid game that let you taunt people using video—an idea that sounds great in print but scares me in reality. While the PC version has enjoyed one or two patches, the 360 version is about to get its first batch of downloadable content from Live Arcade. Here are the packages that are going live:

Name: Ground Zero Map
Price: Free
Availability: All Xbox Live regions
Dash Details: Ground Zero Map Lead your army against a rival commander in this 1v1 map featuring a huge meteor impact crater. Download this free map now! This map is also playable in single-player skirmish mode.

Name: Map Pack 1 Developer Interview
Price: Free
Availability: Not available in Japan
Dash Details: Get the developer’s insider tour of the five intense map designs available in Command & Conquer 3 Map Pack One. Discover the key strategies to dominating the Tiberium fields and securing victory, whether you play GDI, NOD or Scrin!

Name: Map Pack 1
Price: 500 Points
Availability: Not available in Japan
Dash Details: Map Pack 1 What kind of commander are you? Are you a defensive specialist, an air-superiority junkie, or a clandestine operations fanatic? Whatever your preferred strategy, test your skills and your wits against other commanders in five new multiplayer maps. Ranging from brutal 1v1 shootouts to gigantic 2v2 showdowns, this pack has something for every commander. Includes: Black’s Bigger Battle | Tiber River Valley | Frontier Fracas | Tiberium Gardens III | Tournament Desert Redux. All maps are playable in single-player skirmish mode as well.

Name: Factions Picture Pack
Price: 80 Points
Availability: Not available in Japan
Dash Details: Whose side are you on? Support your favorite Command and Conquer 3 faction with these gamerpics! Includes 4 gamerpics: GDI, NOD, Scrin, and unique Tiberium EA logo.

A free map? Our cup runneth over. $6.25 for five new maps is an okay deal, but the dollar for that picture pack…are people buying these things? I'm glad that EA is still supporting the game—and these downloads aren't nearly as annoying as some of the other EA offerings in this area—but I think I'll keep both my nickel and my dime in my pocket.
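For reference, the dollar figures above use the standard Microsoft Points conversion of 80 points to one US dollar; a trivial sketch:

```python
POINTS_PER_DOLLAR = 80  # Microsoft Points exchange rate for US accounts at the time

def points_to_dollars(points: int) -> float:
    return points / POINTS_PER_DOLLAR

print(points_to_dollars(500))  # Map Pack 1 -> 6.25
print(points_to_dollars(80))   # Factions Picture Pack -> 1.0
```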

When Ars last examined the state-of-the-art in gecko mechanics, researchers were measuring the strength of single fibers from the bottom of their feet, hoping to gain insights into how these translate weak van der Waals attractions into the ability to scamper up walls. Since then, researchers have been making pretty good progress in fashioning carbon nanotubes into fibers with similar adhesive properties, allowing them to make adhesives that approach the ability of geckos to stick to surfaces.

In a paper that should show up at PNAS later this week1, a research team has discovered that the lessons of the gecko go well beyond the properties of the individual fibers. It turns out that the biological versions of these fibers are arranged in hierarchical clusters, and the research team sought to mimic this organization. They experimented with creating various bundles of carbon nanotubes and compared their adhesive properties with both unbundled nanotubes and live geckos (join me, if you will, in imagining the gecko harness involved…).

Unbundled, their nanotube tape was nearly as adhesive as a live gecko, but as these same tubes were clustered into bundles, their strength went up. By the time the authors optimized the combination of fiber length and bundle width, their tape was over four times stronger than a gecko: a square centimeter was sufficient to support nearly four kilograms. Although this was weaker than the initial strength of a standard piece of adhesive tape, the "gecko tape" had staying power. Its adhesive properties remained stable over time, while those of the adhesive tape dropped below those of the gecko tape after about five minutes.
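Converted into force, those numbers are easy to sanity-check; a short sketch (unit conversion only, using the figures reported above):

```python
G = 9.81  # standard gravity, m/s^2

def holding_force_newtons(kg_per_cm2: float, area_cm2: float) -> float:
    """Force an adhesive patch can support, given its rated load per square centimeter."""
    return kg_per_cm2 * area_cm2 * G

tape = 4.0          # ~4 kg per square centimeter for the bundled-nanotube tape
gecko = tape / 4.0  # "over four times stronger than a gecko" implies roughly 1 kg/cm^2

print(f"1 cm^2 of gecko tape holds ~{holding_force_newtons(tape, 1):.0f} N")   # ~39 N
print(f"1 cm^2 of gecko foot holds ~{holding_force_newtons(gecko, 1):.0f} N")  # ~10 N
```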

Because of its reliance on van der Waals forces, the gecko tape had some unusual properties. These forces can work between any two surfaces, allowing the tape to stick to Teflon with roughly half the efficiency of its adhesion to a charged surface. Because the forces are proportional to surface area, peeling the tape works remarkably well: for most angles, peeling gently reduced the surface area, allowing the tape to come off with little force and no damage.

The only downside seems to be the consequences of overloading the tape. The failures tend to be catastrophic, and many of the fibers break and are left behind on the surface. Long term, this will degrade the performance of what is an otherwise reusable adhesive.

A second paper that should also be released soon2 has a potential solution to that. Instead of having the nanofibers contact the surface directly, it uses them to support a flat surface of similar material. That flat surface maximizes the area capable of undergoing van der Waals attractions, while the fibers create many individual points of failure that have to be overcome before the surface peels away. In tests, the nanofiber backing improved the adhesive properties of the surface by over nine-fold compared to the surface alone, while failure of the adhesion left the fibers involved intact.

1: When PNAS releases the paper, it should appear here.

2: The draft of this paper uses a DOI that has been assigned to an unrelated paper, so I cannot link to it. Those interested can watch PNAS for Glassmaker et al., Biologically inspired crack trapping for enhanced adhesion.

YouTube has launched localized versions of its video-sharing site in nine different languages. YouTube co-founder Steve Chen made the announcement at a press conference in Paris this morning, saying that the sites now have custom-translated pages and interfaces and that more local features were coming in the future. Those features will include channeling country-specific videos, categories, and sections onto each site, as well as displaying country-specific ratings and comments.

Localized versions of YouTube now exist for Brazil, France, Ireland, Italy, Japan, the Netherlands, Poland, Spain, and the UK. Noticeably missing is a version localized for Germany, which the company says should be in the works soon, along with localized versions for other unannounced countries.

Along with the announcement came news that YouTube also signed partnership agreements with various content providers, including the BBC, French news channel France 24, France Televisions, and several Spanish channels. Additionally, YouTube has struck deals with football (soccer) clubs in Europe such as Chelsea, AC Milan, Barcelona, and Real Madrid.

In addition to the nine localized versions of the site, YouTube also made a foray into the mobile space this week with the launch of YouTube Mobile. Previously, the company had launched mobile services with specific carriers (such as Verizon and Vodafone), but launched a (mostly) universal mobile version of the site over the weekend. The mobile version of YouTube can be accessed at m.youtube.com from most major carriers. However, although the site itself loads for almost anyone with a data plan, the video player is not entirely compatible with all phones. BlackBerrys, for example, cannot play the videos from YouTube Mobile.

These strategic moves show that YouTube is still focused on expanding itself in other markets, despite holding the number-one position among video sharing sites in the world. Expanding into the mobile space is the popular thing to do these days, with companies like Google and Ask rushing to port their content to mobile phones before the offerings get too crowded. And localized versions of the main site might split up the otherwise unified community of YouTube, but will also help the company strike deals with smaller, local content providers and offer more targeted videos to its users.

Why the targeting? It's all about the ad dollars, of course. International targeting will allow YouTube to charge higher rates for their international traffic, which itself should grow as a result of localization.