
              Monday, February 28, 2011

              Making The Cut To Favorite Cloud, SaaS, And Tech Bloggers

Dealmaker Media has published a list of their favorite Cloud, SaaS, and Tech bloggers. Once again I am happy to report that I made the cut. I am also glad to see my fellow bloggers Krishnan and Zoli, the driving force behind Cloudave, on the list. I was on a similar list of top cloud, virtualization, and SaaS bloggers that they published in the past.

Under The Radar is one of the best conferences that I go to. It is the best place for disruptive start-ups to pitch and get noticed, and the organizers make a great effort to connect entrepreneurs with investors and bloggers like me. I have blogged about disruptive early stage cloud computing start-ups as well as disruptive start-ups in the NoSQL and virtualization categories. Most of these start-ups have either had a good exit or have been doing well. The best example so far is Heroku's $212M exit; I met the Heroku founders at Under The Radar a couple of years back.

              I am looking forward to soaking up even more innovation this year!

              Tuesday, April 27, 2010

              Delphix Is A Disruptive Database Virtualization Start-up To Watch

              This is my second post on my impressions from the Under The Radar conference. Check out the first post on NoSQL.

Virtualization is not cloud computing. However, virtualization has significant potential when it is used to achieve cloud-like characteristics such as elasticity, economies of scale, accessibility, and simplicity of deployment. I have always believed that the next wave of cloud computing is going to be all about solving “special” problems on the cloud - I call it a vertical cloud. These vertical problems could be in any domain, technology stack, or industry. Raw computing has come a long way; it is about time we did something more interesting with it.

Delphix is attempting to solve a specific problem - database virtualization. I met the CEO Jedidiah Yueh and the VP of Sales Kaycee Lai at the Under The Radar reception the night before. They have a great background in understanding the cost and flexibility issues around de-duplication from their days at EMC, and they have assembled a strong team, including Alok Srivastava from Oracle, who ran Oracle RAC engineering prior to joining Delphix. Most large database deployments have multiple copies of a single database that customers use for purposes beyond production, such as staging, testing, and troubleshooting. This replication is expensive from a process, resources, and storage perspective, and provisioning new instances takes a long time. The founders saw this problem first hand at EMC and decided to solve it.

At the core, their offering is a read-write snapshot of a database, and that is quite an achievement. Traditional snapshots are read-only - you can't modify them - and in exchange for that compromise they occupy far less space than full copies. Delphix took the same concept but made the snapshots writable, and wrapped it in a seemingly easy-to-use application (I haven't used it) that allows quick de-duplication based on these snapshots. You can also go back in time and start your instance from a point in the past.
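As an aside, the general technique behind writable snapshots is copy-on-write: the clone shares every block with the source until a block is modified, so only the deltas consume new storage. Here is a minimal, purely illustrative Python sketch of that idea (my own toy model, not Delphix's implementation or API):

```python
# Toy copy-on-write "writable snapshot" (illustrative only; not Delphix's
# actual implementation or API).

class Database:
    """A toy database: just a list of fixed-size blocks."""
    def __init__(self, blocks):
        self.blocks = list(blocks)

class WritableSnapshot:
    """Shares blocks with the source until a block is written (copy-on-write)."""
    def __init__(self, source):
        self.source = source
        self.overrides = {}                   # block index -> private copy

    def read(self, i):
        # Unmodified blocks come straight from the source; no extra storage.
        return self.overrides.get(i, self.source.blocks[i])

    def write(self, i, data):
        # Only blocks that are actually modified consume new space.
        self.overrides[i] = data

prod = Database(["block-a", "block-b", "block-c"])
test = WritableSnapshot(prod)                 # near-instant "clone" of production
test.write(1, "block-b-modified")             # only this block gets copied

print(test.read(0))                           # block-a (shared with production)
print(test.read(1))                           # block-b-modified (private delta)
print(len(test.overrides), "of", len(prod.blocks), "blocks duplicated")   # 1 of 3
```

Creating the clone is near-instant regardless of database size, and storage grows only with the blocks the staging or test copy actually changes, which is why the approach is so attractive for provisioning non-production copies.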

Delphix has a great value proposition in database virtualization: help customers reduce their hardware and people costs - DBAs and system administrators - while accelerating IT processes. I like their conscious decision not to go after the backup market. Sometimes you have a great product, but if it is marketed in the wrong category, with vendors fighting in a red ocean, you could die before you can grow. They had the best pitch at the conference - very calm, explaining the problem, articulating the value proposition, emphasizing the right people on the team, and identifying the target market. If you are an entrepreneur (or even if you are not), check out their pitch and Q&A. There is a lot you can learn from them.

              Monday, May 11, 2009

              Cloud Computing - Old Wine In A New Bottle?

A recent cloud computing report from McKinsey stirred quite a controversy. TechCrunch called the report partly cloudy, and Google responded with great detail on why the cloud is relevant. I appreciate the effort that McKinsey put into this report; however, I believe they took a very narrow approach in their scope and analysis. An interaction designer, Chris Horn, from MAYA Design sent me a paper, The Wrong Cloud, which argues that cloud computing is essentially old wine in a new bottle and that the big companies are fueling the hype.
              "Today’s “cloud computing” claims to be the next big thing, but in fact it’s the end of the line. Those corporate dirigibles painted to look like clouds are tied to a mooring mast at the very top of the old centralized-computing mountain that we conquered long ago."

I appreciate that there are people out there who question the validity and relevance of cloud computing. This puts an extra onus on cloud computing companies and others to make their message crisper and communicate the real value they provide. I was recently invited to the Under The Radar conference, where many early stage cloud computing start-ups presented. The place was packed with venture capitalists closely watching the companies and taking notes. It did feel like 1999 all over again! I hope we don't fuel the hype, and instead deliver a clear message on how cloud computing is different and what value it brings. Here are my arguments on why the cloud is not just a fad:


Utility-style cheap, abundant, and purpose-agnostic computing was never accessible before: There are plenty of case studies about the near-zero adoption barrier of Amazon EC2, which let people access purpose-agnostic computing at a scale that had never been technologically and economically feasible before. I particularly like the case study of the Washington Post, which used Amazon EC2 to convert 17,481 pages of non-searchable PDF to searchable text by launching 200 instances, for less than $150 and in under nine hours. We did have massive parallel processing capabilities available to us, such as grid computing and clusters, but they were purpose-specific, expensive, and not easy to set up and access.
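The arithmetic behind that case study is worth spelling out. Using only the figures quoted above, and treating the time and cost as upper bounds, a quick back-of-envelope calculation looks like this:

```python
# Back-of-envelope numbers for the Washington Post EC2 case study,
# using only the figures quoted above (time and cost are upper bounds).

pages, instances = 17_481, 200
hours, total_cost = 9, 150.0                  # "under nine hours", "less than $150"

pages_per_instance = pages / instances        # ~87 pages handled per instance
instance_hours = instances * hours            # at most 1,800 instance-hours
cost_per_page = total_cost / pages            # under ~$0.009 per page
cost_per_instance_hour = total_cost / instance_hours   # under ~$0.08 per instance-hour

print(f"{pages_per_instance:.0f} pages/instance, "
      f"<${cost_per_page:.4f}/page, <${cost_per_instance_hour:.3f}/instance-hour")
```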

Peer-to-peer and cloud computing are not alternatives at the same level: The MAYA paper argues that cloud computing is similar to P2P. I believe the two are complementary technologies. P2P solves the last-mile problem of client-side computing, whereas cloud computing is a collection of server-side technologies and frameworks with centralized computing characteristics. BitTorrent is a great example of using P2P effectively for distribution, since distribution is fundamentally a decentralized problem that can leverage the bandwidth and computing of personal computers. However, I do see potential in combining both approaches to design end-to-end solutions for certain kinds of problems, e.g. using a CDN on the cloud with P2P streaming to broadcast live events.

Virtualization and cloud computing are not the same: McKinsey's report recommends that organizations get the most out of virtualizing their data centers rather than adopting true cloud computing. I am a big fan of virtualization, but it does not replace cloud computing and does not yield the same benefits. Eucalyptus, an emerging cloud computing start-up, has a detailed analysis of how cloud computing differs from virtualization.

              Friday, March 13, 2009

              Top Cloud, Virtualization, and SaaS Blogs - My Blog Makes The Cut

The organizers of the Under The Radar Conference have listed me among the "best bloggers and journalists who’ve distilled this foggy space down to a well-defined, understandable sector" in the category of top cloud, virtualization, and SaaS blogs. Some of the other names include Nick Carr, Geva Perry, and MR Rangaswami. I have also been invited to the conference. I am looking forward to the event and the energy that all the early stage start-ups bring!

              Thursday, October 16, 2008

              Greening The Data Centers

Recently Google published the Power Usage Effectiveness (PUE) numbers of their data centers. PUE is defined as the ratio of the total power consumed by a data center to the power consumed by the IT equipment in the facility. Google's data centers' PUE ranges from 1.1 to 1.3, which is quite impressive, though it is unclear why the data centers have slightly different PUEs. Are they designed differently, or are they simply not all tuned for energy efficiency? In any case, I am glad to see that Google is committed to the Green Grid initiative and is making the measurement data and method publicly available. This should encourage other organizations to improve the energy performance of their data centers.
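For readers new to the metric, the calculation itself is trivial: PUE is total facility power divided by IT equipment power, so a PUE of 1.2 means the facility draws 20% more power than the IT equipment consumes, with the extra going to cooling, power distribution losses, lighting, and other overhead. A tiny sketch, using made-up facility numbers rather than Google's:

```python
# PUE = total facility power / power consumed by the IT equipment.
# The facility figures below are made up purely for illustration.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    return total_facility_kw / it_equipment_kw

print(pue(1_200, 1_000))   # 1.2 -> the facility draws 20% more power than the IT gear
print(pue(2_000, 1_000))   # 2.0 -> only half of the incoming power reaches the IT gear
```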

              The energy efficiency of a data center can be classified into three main categories:

1. Efficiency of the facility: PUE is designed to measure this kind of efficiency, which depends on how the facility that hosts the data center is designed - its physical location, layout, sizing, cooling systems, and so on. Some organizations have gotten quite creative here, for example by building underground data centers to achieve a consistent temperature, siting data centers near a power generation facility, or even setting up their own captive power plants to reduce distribution loss from the grid and meet peak load demand.

2. Efficiency of the servers: This efficiency is based on the efficiency of the servers' hardware components, such as CPUs, cooling fans, and drive motors. Sun has made significant progress in this area in providing energy-efficient solutions: it backs OpenEco, an organization that helps participants assess, track, and compare energy performance, and it has also published its own carbon footprint.

3. Efficiency of the software architecture: To achieve this kind of efficiency, the software architecture is optimized to consume less energy while providing the same functionality. Optimization techniques have so far focused on performance, storage, and manageability, ignoring the architectural tuning that brings energy efficiency.

Round robin, for example, is a popular algorithm for balancing load across servers, but it has been shown to be energy-inefficient. Another example is compression: data compressed on disk requires CPU cycles to uncompress, whereas data stored uncompressed requires more I/O. Everything else being equal, which approach requires less power? These are not trivial questions.
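To see why plain round robin can be energy-inefficient, consider a toy model (all power figures below are hypothetical, chosen only to illustrate the shape of the problem): servers draw a large fraction of their peak power even when nearly idle, so spreading a light load evenly keeps every server powered on, while consolidating the same load onto a few servers lets the rest be switched off.

```python
# Toy model of why round-robin load balancing can waste energy.
# All power figures are hypothetical, for illustration only.

IDLE_W, PEAK_W, CAPACITY = 200.0, 300.0, 100     # per-server idle/peak watts, req/s

def server_power(load_req_s):
    """Linear power model: a powered-on server draws IDLE_W even with no work."""
    if load_req_s == 0:
        return 0.0                               # server is switched off entirely
    return IDLE_W + (PEAK_W - IDLE_W) * (load_req_s / CAPACITY)

def pool_power(per_server_loads):
    return sum(server_power(load) for load in per_server_loads)

servers, demand = 10, 200                        # 200 req/s across a 10-server pool

round_robin = [demand / servers] * servers       # every server stays lightly loaded
consolidated = [CAPACITY, CAPACITY] + [0] * (servers - 2)   # pack load, switch off the rest

print(pool_power(round_robin))                   # 2200.0 W (10 servers at 220 W each)
print(pool_power(consolidated))                  #  600.0 W (2 servers at 300 W each)
```

This is, of course, a caricature - real schedulers must trade energy against latency, redundancy, and failover - but it shows how energy-aware load balancing differs from classic round robin.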

I do not favor an approach where the majority of programmers are required to change their behavior and learn a new way of writing code. One way to optimize the energy performance of a software architecture is to adopt an 80/20 rule: 80% of applications use 20% of the code, and in most cases that is infrastructure or middleware code. It is relatively easy to educate and train this small subset of programmers to optimize their code and architecture for energy efficiency. Virtualization could also help a lot here, since the execution layers can be abstracted into something that can be rapidly changed and tuned, without affecting the underlying code, while providing consistent functionality and behavior.

Energy efficiency cannot be achieved by tuning things in isolation; it requires a holistic approach. PUE identifies the energy lost before it reaches a server, an energy-efficient server requires less power to execute the same software than other servers, and an energy-efficient software architecture lowers the energy consumed for the same functionality the software provides. We need to invest in all three categories.

Power consumption is just one aspect of being green. There are many other factors, such as how a data center handles e-waste, the building materials used, and the greenhouse gases from the captive power plant (if any) and the cooling plants. However, tackling energy efficiency is a great first step in greening the data centers.

              Monday, August 18, 2008

              Cisco and Juniper eyeing the long tail of consumers for their second act

Very few companies have excelled in business beyond 25 years on their first act alone - a single product or business model. Some companies recognize this early on and some don't. The networking giants Cisco and Juniper seem to get it and are looking for their second act. You don't wake up one day and drastically change your business model; getting to a second act takes a conscious decision based on long-term strategy with very focused short-term execution.

Cisco started its "human network" efforts by acquiring Linksys, and Susan Bostrom completely rebranded Cisco a couple of years back. Consumerization of the brand was a big leap for an enterprise-centric organization trying to get closer to non-enterprise consumers. A few days back Cisco announced its Q4 results, and John Chambers emphasized that Cisco would invest in adjacencies:

              "..and we will use this time as an opportunity to expand our share of customer spend and to aggressively move into market adjacencies."

On the other side of the networking world, Juniper recently hired Kevin Johnson as CEO; he was previously the president of the Platform and Services Division at Microsoft. Competing with Cisco has been challenging, and Juniper has had its own share of issues in the past, but let's not forget that this company started during the dot-com era, had a spectacular run, survived the bust, and kept growing. Now is probably the right time to look for the second act.

For Cisco, what could the second act be? Beyond the obvious long tail of the consumer-centric human network strategy, I see a couple of possibilities:

              1) Data Center Virtualization:

Virtualization is a fast-growing market segment that has not yet saturated. The real boundaries of data center virtualization are blurry, since it is a conglomeration of server, network, and storage virtualization, and customers don't necessarily differentiate between managing servers and backing up data across data centers.

This is an adjacency that Cisco can tap into with its current investments in data center virtualization switches such as the Nexus 7000, a strong ecosystem, and a great service organization (service revenue is 20% of product revenue). In fact, this was speculated when Cisco announced the switch.

This could indeed strain, and make precarious, Cisco's relationships with vendors such as IBM that OEM Cisco's switches in their data centers. Companies with large ecosystems inevitably introduce "co-opetition" when they decide to sell into adjacencies currently served by their partners; they will have to learn to walk a tightrope.

Virtualization at scale can lead to rich business scenarios. Imagine a network virtualization switch that is not only capable of connecting data centers at high speed for real-time mirroring and backups but can also tap into the cloud for better network analysis. Routing protocols and network topology analysis require massive parallel processing that can be delivered from the cloud. This could enable many network and real-time voice and data management scenarios that otherwise wouldn't have been possible. Cisco's partnership with a cloud vendor could lead to some interesting offerings - think of it as network virtualization on steroids.

              2) Network SaaS:

Network managed services has always been an interesting business, with a variety of players such as IBM, Nortel, and Lucent. This could be one of the adjacencies that Cisco pursues, making it a true SaaS and not just a managed service. I won't be surprised if Cisco acquires a couple of key SaaS players in the near future.

On-demand and SaaS have traditionally been considered a software and utility play. The networking companies already support the data centers that provision SaaS services, but they could go well beyond that to provide networking SaaS that provisions, monitors, and maintains networks as a true SaaS offering and not just as a managed service. This could include everything from network management to security and related services. Traditionally SIs and partners have played this role, but networking companies could see it as an adjacency and jump in, since it is a natural extension from hardware to data centers to managed services to SaaS delivery. Instead of selling to a service provider who sells services to customers, an effective SaaS can turn the model upside down: partner with service providers instead of selling to them, and sell to an ever-growing long tail of consumers.