With Cisco and Juniper sharing their plans, the SDN game gets interesting

For fans of software defined networking (SDN) — especially those paid to make sense of each vendor claim and then advise clients — this week brought a bonanza of news. Cisco and Juniper, the largest players in the networking space and the two with the most to lose as networking becomes more about software and less about smart boxes, both outlined their strategies for coping with SDN. The question is, can these heavyweights beat the network commoditization looming on the horizon?…

Read More

Forget networking, the data center of the future is software defined too

Software defined networking made a big splash last week at the Interop show in Las Vegas, with a variety of vendors pitching their visions of how abstracting networking away from the physical hardware will change the way IT services are implemented. But the most disruptive element of the entire software defined networking trend is that it enables the next era of IT: the software defined data center, a concept Steve Herrod, the CTO of VMware, introduced at the show.…

Read More

Carrier IQ and the continued erosion of operator trust

The Carrier IQ scandal is still unfolding, and all parties involved are spinning their sides of the story pretty heavily. Meanwhile, the software, which monitors users' keystrokes and text messages and can see passwords and other sensitive information, is said to be on more than 141 million devices. So it's worth looking at the various players to understand who is hurt and who is helped by the kerfuffle around surreptitious smartphone data collection. This brief research note tackles the question of what the Carrier IQ case means for consumers, device makers and, perhaps most important, the operators. Companies mentioned include AT&T, Research in Motion and Sprint. For a full list of companies, and to read the full research note, sign up for a free trial.…

Read More

OpenFlow and beyond: future opportunities in networking

The world of networking is changing, thanks to shifting traffic patterns, more widely distributed webscale systems and the economic pressure on networking to catch up to where the computing and server world is today. This trend toward network virtualization has huge implications for vendors such as Cisco, Juniper, Arista, Dell and Intel, but it could also become the foundation for an entirely new ecosystem of startups and value creation, much as the hypervisor did for computing. In this research note we look at what network virtualization is, why we're moving toward it, what OpenFlow is and what the opportunities are for companies, both large and small, beyond that technology. Additional companies mentioned in this report include Facebook, SeaMicro and Zynga. For a full list of companies, and to read the full report, sign up for a free trial.…
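
To make the OpenFlow idea concrete, here is a minimal controller sketch written against the open-source Ryu framework (one OpenFlow controller among several; the class and rule below are illustrative, not drawn from the report). It shows the core pattern behind network virtualization: the switch keeps a flow table, and a software controller decides what goes in it.

    # A minimal OpenFlow 1.3 controller app for the open-source Ryu
    # framework. Run with: ryu-manager this_file.py
    from ryu.base import app_manager
    from ryu.controller import ofp_event
    from ryu.controller.handler import CONFIG_DISPATCHER, set_ev_cls
    from ryu.ofproto import ofproto_v1_3

    class TableMissToController(app_manager.RyuApp):
        """When a switch connects, install a lowest-priority 'table-miss'
        rule: any packet with no matching flow entry is punted to the
        controller, which can then push new flow rules back down."""
        OFP_VERSIONS = [ofproto_v1_3.OFP_VERSION]

        @set_ev_cls(ofp_event.EventOFPSwitchFeatures, CONFIG_DISPATCHER)
        def switch_features_handler(self, ev):
            datapath = ev.msg.datapath        # the switch that just connected
            ofproto = datapath.ofproto
            parser = datapath.ofproto_parser

            match = parser.OFPMatch()         # wildcard: match every packet
            actions = [parser.OFPActionOutput(ofproto.OFPP_CONTROLLER,
                                              ofproto.OFPCML_NO_BUFFER)]
            inst = [parser.OFPInstructionActions(ofproto.OFPIT_APPLY_ACTIONS,
                                                 actions)]
            # Priority 0 makes this the fallback (table-miss) entry.
            datapath.send_msg(parser.OFPFlowMod(datapath=datapath, priority=0,
                                                match=match, instructions=inst))

Pointed at a software switch such as Open vSwitch, that single rule moves every forwarding decision into software; a real application would then push per-flow rules back down to the switch rather than inspecting each packet at the controller.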

Read More

The Structure 50: The Top 50 Cloud Innovators

In five short years, cloud computing has gone from being a quaint technology to a major catchphrase. Amazon and others are now moving at Internet speed, trying to offer better security, faster networking, more compliance and a host of other products to meet the demands of startups, consumers and enterprises alike. On GigaOM's Structure channel, we cover the gear and software that comprise the cloud, as well as the services and the people who are changing the industry. Now, for the first time, we’ve decided to condense that knowledge into the Structure 50, a list of the 50 companies influencing how the cloud and infrastructure evolve. All of these players, big or small, have people, technology or strategies that will help shape the way the cloud market develops and where it will eventually end up. Companies mentioned in this report include Amazon, Rackspace, Cloudera, China Telecom and SeaMicro. For a full list of companies, and to see the Structure 50 as one full report, sign up for a free trial.…

Read More

Big Data Marketplaces Put a Price on Finding Patterns

A decade ago, scientists would collect data over a period of years, upload that data to a supercomputer, then wait for a scheduled window in which to run it. The process took months — or even years. Now, thanks to cheap processing power and the availability of compute clouds such as Amazon’s EC2 or Microsoft’s Azure, researchers can upload their data and start processing it immediately, as long as they can pay for the CPU time.

Even the government is using the cloud to process data. The National Oceanic and Atmospheric Administration is using Amazon Web Services for its Ocean Observatories Initiative, a study surveying ocean temperatures to detect and predict climate change. And as James Watters, senior manager of cloud solutions development at VMware, notes, the coming trend will be moving your data to the cloud, rather than storing it on hard drives or DVDs whose contents are then uploaded to a supercomputer someplace — which makes the cloud a necessary tool for supporting future economies built around information.

Analyzing huge data sets with access to seemingly unlimited compute power isn’t just a benefit for climate researchers or those seeking to decode the latest H1N1 virus. The huge amount of digital information generated by financial monitoring companies, our interactions with people and products on the web, and government data (to name a few examples) is something that analysts, app developers and average citizens can all benefit from. The challenge is making that data intelligible and accessible, and that's what Infochimps, Microsoft and an emerging class of data marketplaces are aiming to do.…
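
As a concrete sketch of that upload-and-process workflow on Amazon's cloud, the snippet below uses the boto3 Python SDK; every name in it (bucket, dataset, machine image, job script) is a hypothetical placeholder, and it illustrates the pattern rather than any recipe from the article.

    # Minimal sketch of the upload-and-process pattern using Amazon's
    # boto3 SDK. The bucket, file, AMI and job script are hypothetical
    # placeholders; a real run also needs credentials and an IAM role.
    import boto3

    s3 = boto3.client("s3")
    ec2 = boto3.client("ec2")

    # 1. Push the raw dataset into cloud storage.
    s3.upload_file("ocean_temps_2010.csv",       # local file (hypothetical)
                   "example-research-bucket",    # bucket (hypothetical)
                   "raw/ocean_temps_2010.csv")

    # 2. Rent a compute instance that fetches and crunches the data.
    #    The user-data shell script runs automatically at first boot.
    startup = """#!/bin/bash
    aws s3 cp s3://example-research-bucket/raw/ocean_temps_2010.csv /tmp/
    python3 /opt/jobs/analyze_temps.py /tmp/ocean_temps_2010.csv
    """

    ec2.run_instances(
        ImageId="ami-0123456789abcdef0",   # placeholder machine image
        InstanceType="c5.4xlarge",         # metered: pay only while it runs
        MinCount=1, MaxCount=1,
        UserData=startup,
    )

The economics are the point: storage and CPU time are metered, so a researcher pays for the hours the analysis actually runs instead of waiting for a slot on a shared supercomputer.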

Read More