The Big Dimmer Switch

EDN Admin

Today, we're introducing Vector. It's a blog about developers, platform, and big ideas/trends in tech, but with a little bit different bent than what folks are used to from Channel 9... It's been eight years since Channel 9 first introduced many of our key engineering folks to the developer community to deliver a transparent view into how we think about the technologies we're working on, the teams doing the work, and the huge community of people in and around Microsoft that are connected to all of it. None of that changes ... the team continues to produce the great content that has personified Channel 9 since 2004. But we're also going to spend some time on a more elevated view of what we're thinking when it comes to our own strategy, our competitors, broad-based industry stuff, and of course the intersection of all of this with business. Sometimes it'll be topical, other times it will just offer commentary on something that people are talking about. We will also have guest posters talking about what they're doing that fits in this vein. So that brings us to an actual post on something meaningful ... the disruptive impact of the services model on businesses. - The Channel 9 Team

In 2007, Nick Carr published "The Big Switch" (http://www.nicholasgcarr.com/bigswitch/), which chronicles the evolution of electricity from being locally generated by businesses on their factory floors to something we all now consume as a pay-for-what-you-use utility. The book draws parallels between electricity and packaged software as a means to offer up a potential end state for cloud computing, hence the title: there will be, the book asserts, a switchover from in-house datacenters to software delivered as a utility, and it's just a matter of time. It's a good book, not steeped in technical jargon but rather a set of thoughtful mappings between these two eerily similar eras of technology disruption.
Nearly five years have gone by since it was first published, and a lot has happened (and not happened) relative to the pace of cloud adoption ... we know a lot more about what motivates companies to push some apps out the door and into public clouds in a hurry, while other apps will take their time, or maybe even continue to run on-premises in so-called private cloud environments. So it raises a bunch of questions: What is the end state for the services disruption? Is it public cloud platforms and SaaS, or a hybrid of public and private cloud deployments, combined with traditional IT? In other words, is the big switch actually more of a dimmer switch, in the sense that it's not just a simple matter of on/off? Are there any other historical lessons or examples of disruption we can draw insight from?

Disruption is a great word ... if you talk to enough developers, IT folks, and industry pundits about services, "disruption" shows up as by far the most-used and (IMO) best descriptor for what's happening in computing today. Scenarios that used to be impractical, uneconomical, and just plain impossible are now fair game for developers to build and deliver, all because of increasingly cheap and abundant resources like compute, storage, and bandwidth, available at scale and on a pay-as-you-go basis. So when people talk about disruption, the shape of its impact on the market is generally assumed by most folks to be one of outright replacement, such as the advent of electricity as a utility, the combustion engine's replacement of the horse-drawn carriage, and digital media's disruption of physical media, to name a few - basically, the end state in which the disruptive technology means buggy-whip obsolescence for the existing technology. But not all disruptions play out this way, and there are more than a few historical examples, my personal favorite being the captivating story of the microwave business. Seriously, it's actually pretty interesting ...
So here's the story (courtesy of Wikipedia: http://en.wikipedia.org/wiki/Microwave_oven): As we all know, atomic research started in the 1940s for military purposes, but one of the offshoots of it was the discovery that you could actually use microwave radiation to heat food. Like most technology disruptions (including cloud computing), the discovery and development pre-dated mass adoption by many years. In the case of the microwave, the earliest patents were filed by Raytheon just after World War II, and were licensed to Tappan for the first home-use microwave. It was introduced in 1955 and cost over $1,000, and not surprisingly it didn't do well in the market. Raytheon got back into the game by acquiring Amana and introducing the Radarange in 1967 for about $500, and that's really where market adoption began to take shape. In 1971, 1% of US households owned a microwave; by 1986 it was 25%, and today it's over 90%. What's interesting here is how the adoption curve was shaped by the market's education on what you could and couldn't do with this thing. Keep in mind that the value prop of the microwave was time savings for the subset of cooking tasks for which the new technology could be used. Can you bake a cake with a microwave? They're not ideal for that. Can you thaw out frozen stuff? Yeah, it's great for that. What about broiling a salmon? Well, no. How about reheating leftovers? Yeah, it's perfect for that. Why do sparks fly everywhere when I put metal in it? You should really read the owner's manual. This was all part of what could best be described as a partitioning process ... partitioning what you do in a kitchen between the existing thing and the new thing. How was this process accelerated? Through outright education, in many cases via print and TV advertising, which were rife with "ideas" about what you could actually cook, but also by shipping microwave cookbooks with the actual units.
Some of the recipe ideas were a stretch (Thanksgiving turkey in a microwave?), but over time, people figured it out and knew what they should and shouldn't be cooking with it, and that essentially determined the end state for the disruptive technology: every modern kitchen will generally have both a conventional oven and a microwave.

So what can we learn from this? Allowing for the fact that the oven business and computing are two entirely different animals, the biggest and most obvious parallel is the ongoing education of the market that we're seeing now about which apps are and are not necessarily well-suited to public cloud deployment. In other words, within any business app portfolio, there are no-brainers for cloud deployment (web workloads, email, collab, CRM, test, HPC, etc.), while other apps and workloads are subjected to more scrutiny, at least for the time being (ERP, mission-critical apps, apps with HBI data, etc.). Every business, no matter how small, has a portfolio of apps, and this process of portfolio partitioning is pretty similar to the task-partitioning process that shaped microwave adoption. In the software industry, we see this in business scenarios, in which there's a lot of focus these days on things like PII at massive scale, infrastructure security, data sovereignty, the regulatory environment, and a host of other factors that business folks consider as part of the go/no-go decision on cloud computing. At any rate, the end state is becoming increasingly clear: businesses end up with a mixed bag of delivery and deployment approaches to deal with the variable needs and complexities of each and every app in their respective portfolios, at least for the foreseeable future. If you pay attention to cloud computing rhetoric in the industry, hybrid cloud is the new black, but for our part, the answer was there all along.
There's an obvious tension between the possibilities afforded by what is arguably the biggest shift in our industry since the advent of client/server, and the practical realities of technology change for businesses that are grappling with this new era of computing. But from a technical strategy standpoint, there is no confusion on our part ... the design point that defines cloud computing is the path forward for app dev: the next set of apps that matter will be designed for scale and elasticity. They'll be resilient, multi-instance, and highly available. Everything we're doing in the platform across Windows Azure and Windows Server is geared toward enabling developers to meet this design point with the new apps they're building. This means that today's great debate is twofold: a) where will these new apps run? And b) where will those existing apps end up? Off-prem in the cloud, or on-prem in the data center? The answer is "yes." And that's really the point ... the discussion about *where* the apps run (and what most people fixate on when they talk about the services disruption) is orthogonal to the discussion about whether it's an app that meets the bar for the cloud design point vs. an n-tier app running in a VM that's really yesterday's design point. Most businesses' app portfolios will have both kinds, old school and new school, and they'll be partitioned across off-prem and on-prem. Cloud adoption in big companies still has a ways to go, but even in these early days, the emerging trend line is becoming increasingly clear. Nick Carr even called this one in "The Big Switch," in the form of this excerpt from pg. 118:

"...larger companies...can be expected to pursue a hybrid approach for many years, supplying some hardware and software requirements themselves and purchasing others over the grid. One of the key challenges for corporate IT departments, in fact, lies in making the right decisions about what to hold on to and what to let go."

The idea that app portfolios will be partitioned in this way seems pretty intuitive to the folks that are grappling with the change, at least based on the customer discussions we're having these days. To be clear ... the cloud design point is, without a doubt, what we're headed toward with a new generation of apps that are going live in increasing numbers every day (and the subject of a future post), but their place in the broader business app portfolio makes its trajectory more akin to a dimmer that's turned up over time than a simple on/off. Thanks for listening – comments & feedback welcome.

-Tim
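As a purely illustrative aside on the "resilient, multi-instance" design point mentioned above: one of its simplest consequences is that an app should assume any remote dependency can fail transiently and retry idempotent calls rather than crash. Here is a minimal retry-with-backoff sketch in Python; the function and service names are hypothetical, not part of any platform discussed in the post.

```python
import time

def call_with_retries(fn, attempts=3, base_delay=0.01):
    """Retry an idempotent call with exponential backoff.

    Apps built for the cloud design point assume any dependency can
    fail transiently; retrying instead of failing outright is one of
    the basic resilience patterns. (Illustrative sketch only.)
    """
    for attempt in range(attempts):
        try:
            return fn()
        except ConnectionError:
            if attempt == attempts - 1:
                raise  # out of retries: surface the failure
            time.sleep(base_delay * (2 ** attempt))

# Hypothetical dependency that fails twice and then recovers,
# simulating a transient outage of a remote service.
failures_left = {"count": 2}

def flaky_service():
    if failures_left["count"] > 0:
        failures_left["count"] -= 1
        raise ConnectionError("transient outage")
    return "ok"

print(call_with_retries(flaky_service))  # survives two transient failures
```

The key design choice is that the caller, not the dependency, owns the failure policy, which is what lets the same app run unchanged across multiple instances where any one call path may be flaky.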
