Entries tagged with “ZDNet”.


This is a guest post from Larry Dignan, Editor in Chief of ZDNet, TechRepublic’s sister site. You can follow Larry on his ZDNet blog Between the Lines (or subscribe to the RSS feed).

Cisco’s Unified Computing System is garnering interest, but storage appears to be the focus for CIOs as they ponder the next-generation data center, and that’s good news for EMC and NetApp, according to a Goldman Sachs survey.

Goldman Sachs surveyed 100 IT executives at Fortune 1000 companies to get a read on their data center plans two to three years from now.

Among the takeaways:

Cisco’s Unified Computing System (UCS) has found “a surprisingly receptive ear,” according to Goldman Sachs. Indeed, 18 percent of respondents plan to evaluate Cisco’s UCS in the next 12 months, an impressive figure for a product that was announced only a few weeks ago. Another two-thirds of IT execs say they expect Cisco to have a larger server presence over the next 2 to 3 years.

Among those surveyed, 18 percent said they will evaluate UCS in the next 12 months, 44 percent said no and 38 percent were unsure.

Cisco, HP and Dell are the vendors expected to increase their data center share, according to respondents; Sun and IBM are seen losing share.

The next-gen data center push is benefiting pure storage players. EMC and NetApp are seen gaining share in the next-gen data center. A key point: as tech giants try to further integrate hardware and software, independent storage vendors NetApp and EMC are benefiting. Why? These vendors work with any architecture, and they’re ahead on storage virtualization.

VMware is seen as the most strategic software vendor, but Microsoft has a better-than-expected finish. Meanwhile, Oracle got a mention as being strategic on the virtualization front.

Cisco and Juniper defend switching turf. Goldman Sachs notes:

Despite the heightened activity in data center networking, including the launch of Juniper’s new high-end switching platform as well as HP’s ProCurve partner ecosystem, Cisco is expected to further extend its already sizable lead in the long-term. This is consistent with our IT Survey’s results pointing to share gains in the near term. Juniper also appears to be gaining traction in switching as our survey points to the company increasing its presence in the data center, with nearly 70% of the respondents citing share gains over next 2-3 years.


Top HP software execs say that, while the computing models represented by ‘the cloud’ are important, they don’t like the name or associated hype.

Top HP software executives said on Tuesday that they believe in the ideas behind cloud computing, but don’t like the name of the approach or the “hype” surrounding it.

Talking at the HP Software Universe show and conference in Vienna, Tom Hogan, senior vice president for software at HP, said the company had taken time to weigh up the promise of cloud computing, which provides web-based access to remote enterprise applications and storage.

“Rather than jump into the hype [around cloud computing] out of the gate — you can’t pick up a newspaper or a technology magazine today without reading about the cloud — we have been very deliberate over the past nine months, assessing where we think the cloud can help us”, Hogan said.

The result of that period of assessment, Hogan told ZDNet UK, was the conclusion that “just like a lot of things in technology, the cloud will not be a panacea”.

Several major technology companies have announced cloud-computing moves recently. These include Microsoft, which launched Azure, a cloud extension to its Windows franchise, as well as Salesforce.com, Amazon and Google.

Hogan said that there will be a place for the cloud. Customers will be able to have a channel strategy for services, somewhat like the channel strategy they have for sales and marketing, he said.

According to Hogan, that means there will be three operations approaches open to enterprises: traditional in-house; outsourced; and in the cloud. “You have a host of applications that you will want to run on-premise in the traditional manner; there will be services for outsourcing (for which we have EDS); and there will be an emerging new paradigm that will aim to capitalize on the cloud,” he said.

Within that context, the cloud is important, and HP has the tools to exploit it, Hogan said. “We think that HP has more capability in fulfilling the promise for the enterprise cloud, which is a heritage strength for HP,” he said.

HP is especially well-equipped to do this, as its EDS business group can provide the processes needed for the cloud model, Hogan said. The company acquired EDS for $13.9bn (£9.37bn) in May, adding EDS’s computer-services expertise to its portfolio.

Hogan’s skepticism about the hype was echoed by other executives at the conference. “A lot of people are jumping on the bandwagon of cloud, but I have not heard two people say the same thing about it,” said Andy Isherwood, HP’s vice president for software services in Europe. “There are multiple definitions out there of ‘the cloud’.”

According to Isherwood, HP prefers to talk about the software-as-a-service (SaaS) model. Isherwood told ZDNet UK: “Customers say: ‘We want solutions from you that we can buy and implement quickly. And we want to do that without investing a lot of our capital in people, equipment and software. We buy SaaS and, if it works: great. We will keep it.’”

HP has become the 10th largest company in the global SaaS market, according to Isherwood.


Despite the bad times, businesses still require their IT to help them stay ahead of the competition by offering customers attractive and innovative products with the service levels they have come to expect, says OpTier’s Motti Tal.

By Motti Tal, OpTier, Special to ZDNet
Posted on ZDNet News: Dec 12, 2008 11:11:23 AM

Commentary–The current turmoil in the economy and the projected near- and medium-term downturn have immediate effects on the way we manage enterprise IT. Businesses across the globe are taking action to reduce cost and improve efficiencies. IT is taking a big hit, and the challenge of effectively managing IT with reduced headcount and budgets is growing. Uncertainty keeps both the business and IT from knowing whether they can truly prepare for future demand; the crisis in the capital markets industry is a telling example of how difficult the current volatility has been to handle on the business side, and what a formidable challenge it poses for IT.

Furthermore, increased M&A activity is driving intensified consolidation and integration requirements. And, as scrutiny increases around every dollar spent on IT, cost-reduction initiatives increase the importance of ongoing consolidation and shared-services initiatives. At the same time, businesses still require their IT to help them stay ahead of the competition by offering customers attractive and innovative products with the service levels they have come to expect.

Business Transaction Management (BTM) technology addresses these critical needs.

BTM technology is the most effective technology available for assuring service levels are met, outages are avoided and IT resources are utilized in accordance with business priorities. It is being successfully used by leading organizations, and is especially helpful for those currently facing economic challenges.

By using BTM technology, organizations are boosting business activity using existing resources, improving the efficiency of IT management and driving down the cost of ownership for application and system management tools.

Boost business activity using existing resources
With BTM, organizations gain complete and continuous business transaction visibility across all tiers of the infrastructure, in real time. This visibility allows system resources to be aligned directly with the most important business activities.

For example, some business transactions may be used only infrequently yet still hog computing resources. With a BTM solution, these transactions can be easily identified and optimized, freeing up valuable resources for more important business-generating activities.

Transactions that are rarely used and do not contribute to business success can be decommissioned altogether, further freeing up resources. Resource-heavy transactions can be re-tuned to assure optimal usage, increasing the ROI of the existing infrastructure and deferring hardware expenses.
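
Tal doesn’t spell out how this identification works inside OpTier’s product, but the kind of triage he describes can be sketched in a few lines of Python. The transaction names, call counts, CPU figures and thresholds below are illustrative assumptions, not OpTier specifics:

```python
# Hypothetical sketch: flag transactions that run rarely but consume a
# disproportionate share of resources, making them candidates for
# re-tuning or decommissioning. All names and numbers are invented.

transactions = [
    # name, invocations per day, total CPU-seconds consumed per day
    {"name": "checkout",       "calls_per_day": 90_000, "cpu_seconds": 1_800},
    {"name": "nightly_rollup", "calls_per_day": 4,      "cpu_seconds": 5_400},
    {"name": "legacy_report",  "calls_per_day": 2,      "cpu_seconds": 2_700},
]

total_cpu = sum(t["cpu_seconds"] for t in transactions)

def is_candidate(t, max_calls=100, min_cpu_share=0.10):
    """Rarely invoked, yet responsible for a large slice of total CPU time."""
    return (t["calls_per_day"] <= max_calls
            and t["cpu_seconds"] / total_cpu >= min_cpu_share)

for t in transactions:
    if is_candidate(t):
        share = t["cpu_seconds"] / total_cpu
        print(f"{t['name']}: {t['calls_per_day']} calls/day, {share:.0%} of CPU -- review")
```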

Avoid outages and improve IT management efficiency
Outages are expensive and damaging to one’s business. Transactions are the most effective early-warning sensors, indicating impending outages before they occur and providing both the time and the insight needed to address them.

Take a large UK-based bank, for example. While detecting a developing service disruption and a potential outage, the bank discovered that an alert based on BTM technology fired as much as four hours before any of its more traditional system and application monitors. Those extra four hours gave staff enough time to attend to the application, keep business transactions flowing smoothly through the system and avoid an outage. This directly contributed to business results and customer satisfaction. Many peer organizations have reported similar experiences.
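
The article doesn’t describe the bank’s alerting logic, but the general idea of a transaction-level early warning can be sketched as follows. The baseline, window size and degradation factor are assumptions for illustration, not values from the bank’s deployment:

```python
# Illustrative sketch: raise an early warning when a transaction's rolling
# average response time drifts well above its baseline, even though no
# infrastructure-level threshold has been breached yet.

from collections import deque
from statistics import mean

class TransactionWatch:
    def __init__(self, baseline_ms, window=20, degradation_factor=2.0):
        self.baseline_ms = baseline_ms
        self.recent = deque(maxlen=window)   # rolling window of response times
        self.factor = degradation_factor

    def record(self, response_ms):
        self.recent.append(response_ms)
        if len(self.recent) == self.recent.maxlen:
            avg = mean(self.recent)
            if avg > self.baseline_ms * self.factor:
                return f"early warning: rolling avg {avg:.0f} ms vs baseline {self.baseline_ms} ms"
        return None

# Simulated feed: healthy latencies followed by a gradual degradation.
watch = TransactionWatch(baseline_ms=120)
feed = [115, 125, 130, 118] * 5 + [200, 260, 320, 380, 450] * 5
for latency_ms in feed:
    alert = watch.record(latency_ms)
    if alert:
        print(alert)   # in practice this would page the operations team
        break
```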

BTM is also the most effective method of isolating the cause of performance problems. It does so without requiring multiple experts to co-operate or join an “all hands call”; it assures that only the necessary people are called in to attend to performance issues. BTM provides a full transaction execution record coupled with a cross-tier performance breakdown view, freeing up resources and allowing an IT department to operate effectively even when headcounts are down.
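
As a rough illustration of what such a cross-tier breakdown looks like, the sketch below attributes one transaction’s end-to-end time to each tier; the tier names and timings are invented for the example:

```python
# Illustrative sketch of a cross-tier breakdown: given the time one
# transaction spent in each tier, show where the latency comes from so
# only the responsible team needs to be engaged.

transaction = {
    "name": "transfer_funds",
    "tier_times_ms": {
        "web front end": 40,
        "application server": 110,
        "message bus": 25,
        "database": 930,
    },
}

total_ms = sum(transaction["tier_times_ms"].values())
print(f"{transaction['name']}: {total_ms} ms end to end")
for tier, ms in sorted(transaction["tier_times_ms"].items(),
                       key=lambda item: item[1], reverse=True):
    print(f"  {tier:<20} {ms:>5} ms  ({ms / total_ms:.0%})")
# Here the breakdown points straight at the database tier, so only the
# database team needs to join the call.
```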

Using BTM technology also frees up costly man-hours spent building reports for the business and IT management. Instead of having to excavate data from multiple sources and manually create reports, information depicting service levels, resources and business usage trends is readily available from a single solution at the press of a button.

Reduce total cost of ownership (TCO) of applications, servers and management tools
Operational and capital expenditures combined make up an IT department’s TCO. We have shown above how BTM helps you save on capital expenditures through deferred hardware purchases, and on operational expenditures through better utilization of experts’ time and reduced outages. However, TCO also consists of the cost of licenses for supporting software tools. The use of these tools can be scaled down with the confidence of knowing that transactions are being monitored with the necessary visibility. Significant savings for organizations adopting BTM are being driven by the decommissioning of monitoring agents and the appropriate use of technical deep-dive tools.

Expedite the adoption of shared services
Shared IT services present a significant value proposition for businesses cutting down on costs. This is why so many businesses are building them and making efforts to move as many infrastructure, application and service functions as possible onto them.

BTM is unique in its ability to provide end-to-end visibility that extends from application front ends across the shared environment and further down the execution chain to external providers. It gives application owners the same level of control and confidence in their application on the shared environment that they would have on a dedicated system. This is an important contributing factor to an organization’s rate of shared-services adoption.
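
OpTier’s actual tracing mechanism isn’t described here. As a generic illustration of how end-to-end visibility across a shared environment can be achieved, the sketch below tags each request with a correlation ID at the front end and records timing at every hop, so the full path can be reassembled into a single transaction record; all function and tier names are hypothetical:

```python
# Generic illustration (not OpTier's actual mechanism): tag each business
# transaction with a correlation ID and record the time spent at every hop,
# so the path through a shared environment can be stitched back together.

import time
import uuid

TRACE = []  # in a real system this would be a central collector

def call_tier(correlation_id, tier, work, *args):
    """Run one hop of the transaction and record how long it took."""
    start = time.perf_counter()
    result = work(*args)
    elapsed_ms = round((time.perf_counter() - start) * 1000, 1)
    TRACE.append({"id": correlation_id, "tier": tier, "elapsed_ms": elapsed_ms})
    return result

def handle_request(amount):
    cid = str(uuid.uuid4())                    # minted at the front end
    ok = call_tier(cid, "shared app service", lambda a: a > 0, amount)
    call_tier(cid, "external provider", lambda a: time.sleep(0.01) or a, amount)
    return cid, ok

cid, ok = handle_request(250)
hops = ", ".join(f"{h['tier']} {h['elapsed_ms']} ms" for h in TRACE if h["id"] == cid)
print(f"transaction {cid}: {hops}")
```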

BTM helps companies assure they can make it past these hard times and come out on the other side with better and stronger capabilities in IT and business.

Biography
Motti Tal is a founder of OpTier and serves as its executive vice president of marketing, product and business development.
