The Office Service Pack team recently announced that Service Pack 2 for Microsoft Office SharePoint Server 2007 and Windows SharePoint Services 3.0 has been released to the Microsoft Download Center. It includes all fixes released prior to SP2, plus several enhancements that improve server farm performance, availability, and stability.
The service pack installation involves certain pre-installation steps and a prescribed sequence for the actual installation. Please review this process carefully before updating your environments.
Service Pack 2 for SharePoint 2007 provides many fixes and performance improvements, as well as some new features. With the new service pack, Internet Explorer 8 is added to the browser support matrix at level one, and Firefox 2.0 and 3.0 at level two. We recommend that every customer start planning a patch window to apply this service pack.
- Ji Lie, Microsoft SharePoint Team
Updates Resource Center for SharePoint Products and Technologies:
http://technet.microsoft.com/en-us/office/sharepointserver/bb735839.aspx
Thursday, April 30, 2009
Wednesday, April 29, 2009
Creating an Information Technology Strategic Plan
The CIO or IT Director is responsible for creating an information technology (IT) vision. This document is the IT strategic plan. So what is an IT strategic plan? Let’s start with strategy from Random House Dictionary:
Strategy:
a plan, method, or series of maneuvers or stratagems for obtaining a specific goal or result: a strategy for getting ahead in the world.
Origin:
1680–90; Gk stratēgía generalship, equiv. to stratēg(ós) military commander, general
In business, the strategic plan sets the overall long-range goals, along with the tactics that will be implemented to attain them. To create an IT strategic plan, start by assessing where you are today and where you want to go over the next one to three years (it is hard to plan much further out for information technology). Then create a document outlining what your information technology will look like in the future. Set realistic goals within your budgets, because this plan is your method of communicating with the CEO and CFO. The document demonstrates that you have a plan and know how IT will benefit the business. If the strategic plan is compelling, you should be able to secure the necessary budget.
This is not War and Peace. Aim for 15 pages, says Gartner VP Dave Aron, who saw one IT plan weigh in at 250 pages. Consider PowerPoint instead of Word as your medium of choice, says Cullen. It fosters brevity. And limit it to 25 slides. – Stephanie Overby, CIO article link: http://tinyurl.com/cx5c3r
Once you have created your IT strategic plan, present it to your CEO, CFO, and your staff. Once staff and management buy in to the plan, almost magically, everyone starts doing things that move you toward the long-term goals!
If you do not create the plan and goals, how can you possibly get there?
Good luck and keep the passion for information technology.
Monday, April 27, 2009
Exchange 2010
Microsoft has released a Beta version of Exchange 2010, and so far the feedback has been positive.
"I found that the latest version (Exchange 2010) empowers administrators by considerably simplifying many tasks they're likely to face. " Mario Morejon PCMag.com
"Now Exchange 2010 brings much more balanced improvements through new features. It is a major upgrade with a new face to it." Christopher Voce, an analyst with Forrester
Here is the technical data on the new product, and a link to download it if you want to load it up in your lab:
http://www.microsoft.com/downloads/details.aspx?displaylang=en&FamilyID=1898ed2c-2f88-48ac-824e-d3d20fad77d7
Overview
Microsoft Exchange Server 2010 Beta helps IT Professionals achieve new levels of reliability with greater flexibility, enhanced user experiences, and increased protection for business communications.
Flexible and reliable - Exchange Server 2010 gives you the flexibility to tailor your deployment based on your company's unique needs and a simplified way to keep e-mail continuously available for your users.
Anywhere access - Exchange Server 2010 helps your users get more done by giving them the freedom to securely access all their communications - e-mail, voice mail, instant messaging, and more - from virtually any platform, Web browser, or device.
Protection and compliance - Exchange Server 2010 delivers integrated information loss prevention and compliance tools aimed at helping you simplify the process of protecting your company's communications and meeting regulatory requirements.
This software is intended for evaluation purposes only. You must accept the license terms before you are authorized to use this software. There is no product support for this trial software. You are welcome to participate in the forums to share your trial experiences with others and to ask for advice.
System Requirements
Supported Operating Systems: Windows Server 2008; Windows Vista 64-bit Editions Service Pack 1
Operating System for Installing Management Tools: The 64-bit editions of Windows Vista® SP1 or later, or Windows Server® 2008.
PC - x64 architecture-based computer with an Intel processor that supports Intel 64 architecture (formerly known as Intel EM64T) or an AMD processor that supports the AMD64 platform
Additional requirements to run Exchange Server 2010 Beta
Memory - Minimum of 4 gigabytes (GB) of RAM per server plus 5 megabytes (MB) of RAM recommended for each mailbox
Disk space
At least 1.2 GB on the drive used for installation
An additional 500 MB of available disk space for each Unified Messaging (UM) language pack that you plan to install
200 MB of available disk space on the system drive
Drive - DVD-ROM drive, local or network accessible
File format - Disk partitions formatted as NTFS file systems
Monitor – Screen resolution 800 x 600 pixels or higher
Exchange Server 2010 Beta Prerequisites
If these required prerequisites are not already installed, the Exchange Server 2010 Beta setup process will prompt and provide links to the installation locations; Internet access will be required if the prerequisites are not already installed or available on a local network.
Microsoft .NET Framework 3.5
Windows PowerShell v2
Windows Remote Management
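To put the sizing numbers above together, here is a quick back-of-the-envelope lab calculator. This is a rough sketch only; the function and its name are mine, not Microsoft's, and real deployments should be sized against the Exchange documentation.

```python
def exchange_lab_sizing(mailboxes, um_language_packs=0):
    """Rough resource estimate for an Exchange Server 2010 Beta lab,
    based on the published minimums above: 4 GB of RAM per server plus
    5 MB per mailbox, 1.2 GB on the installation drive, 500 MB per
    Unified Messaging language pack, and 200 MB on the system drive."""
    ram_mb = 4 * 1024 + 5 * mailboxes
    disk_mb = int(1.2 * 1024) + 500 * um_language_packs + 200
    return {"ram_gb": round(ram_mb / 1024, 1),
            "disk_gb": round(disk_mb / 1024, 1)}

# Example: a 500-mailbox lab with one extra UM language pack
print(exchange_lab_sizing(500, um_language_packs=1))
# -> {'ram_gb': 6.4, 'disk_gb': 1.9}
```

Note that these are the beta's stated minimums; a production sizing exercise would also account for transport queues, logs, and growth.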
"I found that the latest version (Exchange 2010) empowers administrators by considerably simplifying many tasks they're likely to face. " Mario Morejon PCMag.com
"Now Exchange 2010 brings much more balanced improvements through new features. It is a major upgrade with a new face to it." Christopher Voce, an analyst with Forrester
Here is the technical data on the new product and link to download if you want to load it up in your lab:
http://www.microsoft.com/downloads/details.aspx?displaylang=en&FamilyID=1898ed2c-2f88-48ac-824e-d3d20fad77d7
Overview
Microsoft Exchange Server 2010 Beta helps IT Professionals achieve new levels of reliability with greater flexibility, enhanced user experiences, and increased protection for business communications.
Flexible and reliable - Exchange Server 2010 gives you the flexibility to tailor your deployment based on your company's unique needs and a simplified way to keep e-mail continuously available for your users.
Anywhere access - Exchange Server 2010 helps your users get more done by giving them the freedom to securely access all their communications - e-mail, voice mail, instant messaging, and more - from virtually any platform, Web browser, or device.
Protection and compliance - Exchange Server 2010 delivers integrated information loss prevention, and compliance tools aimed at helping you simplify the process of protecting your company's communications and meeting regulatory requirements.This software is intended for evaluation purposes only. You must accept the license terms before you are authorized to use this software. There is no product support for this trial software. You are welcome to participate in the forums to share your trial experiences with others and to ask for advice.
Sytem Requirements
Supported Operating Systems: Windows Server 2008; Windows Vista 64-bit Editions Service Pack 1
Operating System for Installing Management Tools: The 64-bit editions of Windows Vista® SP1 or later, or Windows Server® 2008.
PC - x64 architecture-based computer with Intel processor that supports Intel 64 architecture (formerly known as Intel EM64T) or AMD processor that supports the AMD64 platformAdditional requirements to run Exchange Server 2010 Beta
Memory - Minimum of 4 gigabytes (GB) of RAM per server plus 5 megabytes (MB) of RAM recommended for each mailbox
Disk space
At least 1.2 GB on the drive used for installation
An additional 500 MB of available disk space for each Unified Messaging (UM) language pack that you plan to install
200 MB of available disk space on the system drive
Drive - DVD-ROM drive, local or network accessible
File format - Disk partitions formatted as NTFS file systems
Monitor – Screen resolution 800 x 600 pixels or higherExchange Server 2010 Beta
Prerequisites
If these required prerequisites are not already installed, the Exchange Server 2010 Beta setup process will prompt and provide links to the installation locations; Internet access will be required if the prerequisites are not already installed or available on a local network.
Microsoft .NET Framework 3.5
Windows PowerShell v2
Windows Remote Management
Friday, April 24, 2009
Quote of the day:
On Cloud Computing:
"That is exciting to me as a network player. Boy am I going to sell a lot of stuff to tie that together." However, he added, "It is a security nightmare and it can't be handled in traditional ways."
- John Chambers, Chairman and CEO of Cisco, speaking at a security conference on Wednesday, April 22
PCWorld Article:
http://www.pcworld.com/businesscenter/article/163681/cloud_computing_a_security_nightmare_says_cisco_ceo.html
"That is exciting to me as a network player. Boy am I going to sell a lot of stuff to tie that together." However, he added, "It is a security nightmare and it can't be handled in traditional ways."
- John Chambers, Chairman and CEO of Cisco speaking at Security confab on Wednesday 4/22
PCWorld Article:
http://www.pcworld.com/businesscenter/article/163681/cloud_computing_a_security_nightmare_says_cisco_ceo.html
Building Block Architecture for Superior Performance
by Scott Drummonds at VMWare
If any of you have heard me speak at the numerous events I've done in the past two years, you may have heard me detail the areas where virtualization performance can exceed native performance. There are scalability limitations in traditional software that make nearly every enterprise application fall short of utilizing the cores that are available to them today. As the core explosion continues, this under-utilization of processors will worsen.
In 2008, I visited VMworld Europe and showed how using multiple virtual machines on a single physical host could circumvent the limitations in today's software. In that experiment we showed that 16,000 Exchange mailboxes could fit on a single physical server, when no one had ever put more than 8,000 in a single native instance. We called this approach designing by "building blocks" and were confident that as the core count continued to increase, we'd continue to expose more applications whose performance could be improved through virtualization.
On Thursday of last week, SPEC accepted VMware's submission of a SPECweb2005 result, and last night we posted an article on VROOM! detailing the experiment and providing information on the submission. This submission is an incredible first for us: not only have we shown that we can circumvent limitations in web servers, but we posted a world record performance number in the process. Of course, if any of you have seen Sreekanth Setty's presentation at VMworld on his ongoing work on SPECweb2005, this result wouldn't surprise you.
Getting a benchmark standardization body like SPEC to approve these results isn't always easy. Most of the industry remains stuck in a mode of thinking of performance as a single instance's maximum throughput.
But given the scale-out capabilities of a large number of enterprise applications I'd argue that benchmarking should account for scale-out capabilities on a single box. VMware's customers follow this practice faithfully in sizing their deployments to match their needs and everyone wants to know the platform's ability to handle this use-case. SPEC's willingness to accept results showing building blocks on a single host is commendable and progressive. As more benchmarks approve submissions like these VMware will continue to be able to show record numbers.
Thursday, April 23, 2009
New Release of Microsoft Exchange Server User Monitor (ExMon) - Free Download
Microsoft has just released a new version of the Exchange Server User Monitor (ExMon). ExMon allows administrators to view and evaluate individual users' usage of and experience with Microsoft Exchange Server: ExMon for Exchange 2007 SP1. The correct version is 14.00.0553.004; if this does not come up, you may need to refresh the page.
http://www.microsoft.com/downloads/details.aspx?displaylang=en&FamilyID=9a49c22e-e0c7-4b7c-acef-729d48af7bc9
The Microsoft® Exchange Server User Monitor (ExMon) tool enables administrators to view and evaluate individual users' usage and experience with Microsoft Exchange Server. With this tool, administrators can gather real-time data that helps them better understand current client usage patterns and plan for future use.
Using ExMon, administrators can view the following:
IP addresses used by clients
Microsoft Office Outlook® versions and mode, such as Cached Exchange Mode and classic online mode
Outlook client-side monitoring data
Resource use, such as:
CPU usage
Server-side processor latency
Total latency for network and processing with Outlook 2003 and later versions of MAPI
Network bytes
Frequently Asked Questions
Q: How much disk space is required for ExMon data collection?
A: File size depends on the Exchange server load. You can estimate the required file size by using the average value of the Perfmon counter MSExchangeIS\RPC Operations/sec as the file size in MB per hour. For example, a server that averages 300 RPC Operations/sec requires about 300 MB per hour of free space for ExMon data collection.
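A quick sketch of that rule of thumb (a hypothetical helper of my own, not part of ExMon):

```python
def exmon_trace_size_mb(rpc_ops_per_sec, hours=1.0):
    """Estimate ExMon ETW trace size. Per the FAQ above, the average
    MSExchangeIS\\RPC Operations/sec counter value roughly equals the
    trace file size in MB per hour (e.g. 300 ops/sec -> ~300 MB/hour)."""
    return rpc_ops_per_sec * hours

# Plan free space for a full 8-hour workday on a 300 ops/sec server:
print(exmon_trace_size_mb(300, hours=8))  # -> 2400 (MB)
```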
Q: How long should I collect ExMon data?
A: Tracing time depends on user activity and how you want to use the data. For good averages across all users, it is recommended that you collect data for at least 30 minutes during a period of expected user activity. Some client monitoring data is collected only at certain intervals. Therefore, collecting data for longer may increase the probability of more complete data. When you troubleshoot individual users and problems, traces of one to five minutes are generally sufficient.
Q: Does ExMon support non-English languages of Exchange Server 2003 or later versions and the Windows operating system?
A: Yes. ExMon can be run with any language that is supported by Exchange Server and any language that is supported by Windows. ExMon supports Unicode display names for users. However, the ExMon tool interface and documentation are available only in English.
Q: How does ExMon data collection affect Exchange server performance?
A: The effect of data collection on Exchange Server is less than a two percent increase in CPU or latency. To minimize the effect, you should not collect data on a hard disk drive that is currently being used by Exchange, such as the database, streaming, log file, or queue drives. Also note that ExMon tracing uses a Windows technology known as Event Tracing for Windows (ETW). ETW was designed especially for performance tracing and is used by core parts of Windows. As a result, the effect on the server is less than two percent additional processing time and a negligible additional latency.
Q: Because ExMon data is collected with ETW, can I write my own data parser?
A: No, you currently cannot write your own data parser. The raw data requires a significant amount of analysis to produce meaningful data.
Q: Why does ExMon display only part of a user’s display name?
A: Because of limitations in the tracing and parsing code, ExMon truncates user display names to 32 characters.
Q: Why are some data columns blank?
A: Some data columns are blank because some servers do not provide certain information. ExMon can view data from Exchange Server 2000 SP2 and later versions, Exchange Server 2003 SP1 and later versions, and Exchange Server 2007 SP1 and later versions. Significant changes have been made since the release of Exchange Server 2000, so while ExMon supports data files from all these servers, not all the data is available. For example, the Foreground Latency column in the By Clientmon view requires Exchange Server 2003 SP1, and it also requires that users have Outlook 2003 SP1.
Q: How can I collect data on Exchange that is running on Clustering Services for Microsoft Windows?
A: Tracing ExMon data on Exchange servers that use the Cluster service is difficult because you want to collect data for a specific virtual server rather than just a physical node. A cluster failover during a data collection session causes incomplete data. By collecting in shorter intervals, such as five-minute intervals, on every node of the cluster, you can minimize the amount of data that is lost if there is a failover. Both System Monitor and Tracelog.exe provide functionality to create intervals based on file size instead of time. You can also write a script that runs on cluster failovers and starts and stops the appropriate data collections.
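As a sketch of that interval approach (a sketch only; the session name and the Tracelog.exe invocation in the comments are placeholders you would adapt to your environment):

```python
def rotation_plan(total_minutes, interval_minutes, prefix="exmon_node1"):
    """Break one long collection into short intervals so that a cluster
    failover mid-collection loses at most one interval of data.
    Returns (start_offset_in_minutes, etl_filename) pairs."""
    return [
        (start, f"{prefix}_{i:03d}.etl")
        for i, start in enumerate(range(0, total_minutes, interval_minutes))
    ]

# One hour of tracing in 5-minute chunks on a single cluster node:
for start, etl in rotation_plan(60, 5):
    # At each start offset you would stop the previous trace session and
    # start a new one writing to `etl`, e.g. with Tracelog.exe
    # (placeholder session name and flags - adapt to your setup):
    #   tracelog -stop ExMonSession
    #   tracelog -start ExMonSession -f <etl> ...
    pass
```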
Q: Why doesn't ExMon display data when I have passed in an input file?
A: You may be able to resolve this issue by performing the following tasks:
Make sure that you put the path and file name in double quotation marks if the path or file name contains a space.
ExMon must run on Windows Server 2003 or later versions if the Event Trace Log (.etl) file was collected on Windows Server 2003 or later versions.
Verify that the Exmon.reg file was applied before you began collecting data. For instructions on how to apply the Exmon.reg file, see "Installation" earlier in this document.
Source: Microsoft
Tuesday, April 21, 2009
Social Media and Twitter
I have continued looking at social media, and recently at Twitter, trying to understand the value of the service. The interesting thing is the breadth of its usage and the classic challenge of understanding what it is and what it can do for you. My take is that it is very cool but can be a time sink.
Twitter is a micro-blogging site. Anyone can sign up and start micro blogging, 140 characters per post. You can write whatever you want whenever you want, and many people do exactly that! You can follow whomever you want, and anyone can follow you. There is a block feature, but it is really just for blocking someone who abuses the system with inappropriate comments.
What I like about Twitter is not so much posting personally as following leading thinkers. I look forward to growing my list! I selected several book authors, speakers, and business leaders, and also followed senior IT professionals under #v12n and #vsphere (hashtags for virtualization) whenever I read something of theirs and found it interesting. You can go to people you respect and see who they are following; this is a good source of people you may also want to follow. I added the top celebrities and major news feeds. Then you just read every once in a while to get the latest information and tidbits. Occasionally someone keeps posting information you are not interested in, so you just stop following them. You can also search on hashtags to find information.
So this is how I use Twitter so far, and I am a neophyte. In any case, I think it is an interesting social medium and worth a look. I am not the world authority, so below is a link to a Guy Kawasaki article on how to demo Twitter:
http://blogs.openforum.com/2009/04/19/how-to-demo-twitter/
Friday, April 17, 2009
Best Practices for Configuring Virtual Storage
When virtualizing your file servers, the limiting factors are usually storage I/O utilization and network utilization. The maximum reported IOPS using VMware is 102,240, and the maximum reported network performance is 16 Gbps.
Here are best practices for configuring virtual storage from the VMware Performance Team:
Many of the best practices for physical storage environments also apply to virtual storage environments. It is best to keep in mind the following rules of thumb when configuring your virtual storage infrastructure:
Configure and size storage resources for optimal I/O performance first, then for storage capacity.
This means that you should consider throughput capability and not just capacity. Imagine a very large parking lot with only one lane of traffic for an exit. Regardless of capacity, throughput is affected. It’s critical to take into consideration the size and storage resources necessary to handle your volume of traffic—as well as the total capacity.
Aggregate application I/O requirements for the environment and size them accordingly.
As you consolidate multiple workloads onto a set of ESX servers that share a pool of storage, don't exceed the total throughput capacity of that storage resource. Looking at the throughput characterization of the physical environment prior to virtualization can help you predict what throughput each workload will generate in the virtual environment.
Base your storage choices on your I/O workload.
Use an aggregation of the measured workload to determine what protocol, redundancy protection and array features to use, rather than using an estimate. The best results come from measuring your applications’ I/O throughput and capacity for a period of several days prior to moving them to a virtualized environment.
Remember that pooling storage resources increases utilization and simplifies management, but can lead to contention.
There are significant benefits to pooling storage resources, including increased storage resource utilization and ease of management. However, at times, heavy workloads can have an impact on performance. It’s a good idea to use a shared VMFS volume for most virtual disks, but consider placing heavy I/O virtual disks on a dedicated VMFS volume or an RDM to reduce the effects of contention.
Source: VMware Performance Team
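The aggregation step described above can be sketched in a few lines of Python. All of the workload figures, names, and pool limits below are purely illustrative assumptions – substitute your own measured numbers:

```python
# Hypothetical sketch: sum measured per-workload I/O demand before
# consolidating onto a shared storage pool, and flag overcommitment.
# Every number here is made up for illustration.

workloads = {
    "file-server-1": {"iops": 1200, "mbps": 45},
    "file-server-2": {"iops": 800,  "mbps": 30},
    "mail-server":   {"iops": 2500, "mbps": 60},
}

# Assumed throughput limits of the shared pool (array-specific in practice).
pool_limits = {"iops": 5000, "mbps": 200}

# Aggregate the measured demand across all workloads.
total = {
    "iops": sum(w["iops"] for w in workloads.values()),
    "mbps": sum(w["mbps"] for w in workloads.values()),
}

for metric, limit in pool_limits.items():
    used = total[metric]
    print(f"{metric}: {used} of {limit} ({100.0 * used / limit:.0f}% of pool)")
    if used > limit:
        print(f"  WARNING: aggregate {metric} exceeds the pool limit; "
              "consider a dedicated VMFS volume or RDM for heavy hitters")
```

Running this against your own measurements makes the "don’t exceed the total throughput capacity" rule concrete: if the aggregate exceeds the pool limit, some virtual disks belong on a dedicated volume.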
Wednesday, April 15, 2009
Virtual Server Awareness Products are Being Introduced to the Network Infrastructure
A challenge when supporting a virtual server environment is visibility from the network into the machines. Traditional switches and routers see only the physical hardware and are typically unaware of the virtual servers. As a result, the switches and routers have three issues:
1. No knowledge of communications between two virtual machines (VMs) on the same server.
2. No knowledge of a VM moving from one server to another using tools such as VMotion.
3. Higher density of activity on physical servers due to multiple VMs.
This is fine, but it should be taken into consideration when designing the switching and routing infrastructure, and it can cause challenges when you are trying to troubleshoot a problem on the network.
Yesterday I posted an article on third party products supporting virtualization. Cisco's Nexus 1000v is one of these third party virtualization products used to address the visibility to virtual servers.
Here is Han Yang of Cisco explaining the product in a video:
http://www.cisco.com/en/US/products/ps9902/index.html
Tuesday, April 14, 2009
Growth of Virtualization Support Products
Virtualization is moving to the next phase of integration with support products – third-party backup, management and security software. Many of these much-needed products are in their early stages, which creates challenges around interoperability and the usual challenges of first-release products.
Virtualization is so compelling – from the speed of launching VMs to the cost savings of running fewer physical servers – that most businesses are forging forward in the face of these challenges. The extent of these challenges is questionable: virtualization makes servers easier to manage in most cases, and security may improve with virtualization as more VMs can create more information silos.
In larger organizations the management software integration issue has a greater impact, as large-scale management solutions are more complex to integrate. Microsoft has done a good job of integrating the System Center product suite into the management of Hyper-V right out of the gate, so this makes it pretty straightforward for an organization with a homogeneous Microsoft environment.
In either case, it is great to see the growth of the needed support products for virtualization and this will continue to drive the success of virtualization solutions in all businesses.
Monday, April 13, 2009
Social Media - World Wide Rave and Groundswell
I have been thinking about social media.
I read two books on this subject: Groundswell and World Wide Rave. Both books are excellent. The groundswells on the internet are fascinating because they happen so quickly and cannot be stopped once they are started. Even the creator of the information, video or application usually cannot stop the groundswell once it starts. Like most people, I had a basic understanding of some of these internet explosions – Wikipedia, MySpace and Facebook. I had not really thought much about how to put this to work. Both books are fascinating, with many stories about how different people were successful at making the internet work for them by creating a groundswell or internet rave. Many were significant, though you may not have heard of them.
This is very exciting because once you have read the multiple stories, the penny drops and you start thinking of all the ways you or your company could take advantage of the internet and social media. It also strikes you that there are many more very large opportunities and that we are just now seeing the tip of the iceberg. There is also somewhat of a gold rush – a second wave of the internet boom – and it is upon us.
Here are two examples on YouTube groundswell videos:
1. “Will it Blend” Brought to you by BlendTec
http://www.youtube.com/watch?v=qg1ckCkm8YI
2. “Where the Hell is Matt?”
http://www.youtube.com/watch?v=zlfKdbWwruY
Friday, April 10, 2009
What Got You Here Won’t Get You There by Marshall Goldsmith
I read “What Got You Here Won’t Get You There” about a month ago and it really struck a chord for me. The book is pretty straightforward. Marshall Goldsmith, the author, walks through 20 bad habits that someone might have and says most people will have 5 – 8. The idea is to identify the bad habits you have and change! Of course the change is the hard part, and you have to want to change.
Marshall Goldsmith does a good job explaining why a person should change and that it is possible. He takes it to the next level and says not only is it possible to change but if you do not it will hold you back if it is not already.
I identified my bad habits and set out to change the first one that I thought was the worst - listening. The book explains that everyone has the ability to listen – when they are in an important interview or meeting for example. The challenge is staying engaged like you are in an important meeting all the time. Bill Clinton was a master at this as are many politicians.
Marshall Goldsmith gives the reader different tactics to become a better listener. One technique I liked was to ask friends, relatives and even strangers you are sitting next to on a flight this question: “What two things can I do in the future to be a better listener?” Of course, you would insert your own bad habit for the last two words. Then, for people challenged with listening like me, you are only permitted to say thank you – not argue or disagree. Marshall Goldsmith describes this technique as feedforward as opposed to feedback, and it is less critical and therefore more acceptable. I have done this several times now and it is not too painful. The idea of asking a stranger seemed odd to me at first, but the stranger of course has nothing to gain or lose, so they may give insightful information – I look forward to testing this out when the opportunity arises.
The book is very short, to the point and, for me, needed. I suppose there are those who have no bad personality habits, or whose habits are so minor there is no need for change. If this is the case for you, no need to read the book! However, if you are like me and most people, I think the book is a great easy read and will help you change. Although, after one month, I am struggling to change! :)
Thursday, April 09, 2009
IT Skills: What’s Hot and What’s Not
I read two articles on IT skills in Network World: “Top 10 Technology Skills” and “5 IT skills that won't boost your salary.”
The 5 declining IT skills included: HTML, NetWare, PC tech support, legacy languages like COBOL, and non-IP protocols like SNA. No surprise here! Why not also put Token Ring on the list, along with in-depth experience with the Intel 8086 processor? The list could go on for pages of legacy technology. The challenge for some businesses hanging on to legacy technology is finding someone who can actually support their systems.
The “Top 10 Technology Skills” is a good indication of the state of the industry and where it is going. The list starts with business process modeling as being number one and this explains why you see many businesses getting certifications like CMMI, ISO 9000, and ITIL. The rest of the list is: database, message/ communications, IT Architecture, IT security, project management, data mining, web development (Web 2.0), IT optimization, and networking. The article goes into detail on each.
Missing is virtualization and the soft skills around business IT alignment. Personally, I believe that the ability to translate the business requirements into IT plans and priorities is the number one skill. When you read CIO books and articles this is generally the challenge discussed. Otherwise, the list is accurate and in line with the challenges of IT leaders.
How can an IT Professional use this information?
First select the skills in the list that fit your background and study and research to develop and grow. Once you have developed the skills, take the appropriate certification exams like PMP, ITIL, MCITP (MCSE), CCIE, CISSP.
Now that you have developed your skills and certified, you are done – right? No! You chose to be an IT professional, and this industry is constantly changing, so you need to continue to read the latest information, take update classes and courses on new technology as it comes out, and re-certify.
If you can maintain the passion for the technology – You will love it!
Network World - Top 10 Technology Skills http://www.networkworld.com/news/2009/040609-10-tech-skills.html?tc=car
Network World - 5 IT skills that won't boost your salary http://www.networkworld.com/news/2008/041708-careers-sidebar.html?page=1
Wednesday, April 08, 2009
The Goal: A Process of Ongoing Improvement by Eliyahu M. Goldratt and Jeff Cox
The Goal was a fun book to read. The strange thing is, I purchased the book about five years ago based on a recommendation from Verne Harnish and then proceeded to read only half the first chapter. The book is written as fiction telling a business story, and I don’t love this style. After Verne insisted that it is a great book, I picked it back up a few months ago and read it. I really enjoyed it, and I now list it as one of my three favorites!
The Goal’s protagonist is a manufacturing plant manager. The plant is not profitable, so the plant manager is given 3 months to turn it around or upper management will shut it down.
The plant is a mess with delays on orders, causing very angry customers. There are battles over what projects should be prioritized. The changes in the production plan create problems because management is switching out projects just when hours have been spent setting up everything for a different project. Management is arguing about everything, and the Union is filing grievances – a real mess.
The plant manager meets a “wise man” who agrees to consult as a mentor. The story then walks you through the challenges, the changes and the impact as the plant manager makes dramatic changes to drive profitability in the plant. The message is to align everyone around the goal of making the plant profitable, and to apply common sense to attain the goal!
Although the book has very little to do with technology, it is a great book. I think it is a must read for anyone in management. It really helped me look at business challenges from a different perspective.
How much of the Market is Cloud Computing?
Last week I posted a similar article and I made estimates that were not correct so I deleted the previous post and updated this to reflect the new data I obtained.
The world cloud computing market is growing from an estimated $56 billion this year to $150 billion in 2013, according to Gartner. Merrill Lynch estimated as much as $160 billion in 2011.
So what percentage of the world computing market is cloud computing?
According to Gartner, as reported by Reuters, world information technology spending in 2009 will be $3.5 trillion. I would estimate this will grow to $4.2 trillion in 2013. This equates to a current world cloud computing market of less than 2% of the entire market today, and around 4% somewhere between 2011 and 2013.
Well, 2% – 4% sounds very small! The important point is the actual dollars and the growth rate. Not only is the market growing, but the growth rate is growing! If the growth rates of world information technology and world cloud computing stay relatively constant, cloud computing breaks through 10% in 10 years (2019) and reaches 30% five years later (2024). The growth will slow at some point, but it may accelerate further before it slows! The other key thing to watch is which segments of technology are served well by the cloud. Email and CRM are leading in this space, as are IT support services with remote monitoring and remediation. (Incidentally, if cloud computing and information technology growth could stay at constant growth rates, then in 2030 cloud computing would be 100% of the market.)
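The projection above is simple compound-growth arithmetic, and you can check it in a few lines of Python. This is a sketch under the stated assumption of constant growth rates; the implied rates (roughly 28% per year for cloud and under 5% per year for overall IT spending) are derived from the Gartner figures, not quoted anywhere:

```python
# Derive annual compound growth rates from the 2009 and 2013 figures,
# then project cloud computing's share of total IT spending.
cloud_2009, cloud_2013 = 56.0, 150.0      # $B (Gartner)
it_2009, it_2013 = 3500.0, 4200.0         # $B (Gartner 2009; 2013 estimated)

cloud_rate = (cloud_2013 / cloud_2009) ** (1 / 4)  # ~1.28, i.e. ~28%/yr
it_rate = (it_2013 / it_2009) ** (1 / 4)           # ~1.047, i.e. ~4.7%/yr

def cloud_share(year):
    """Cloud spending as a fraction of all IT spending, assuming the
    2009-2013 growth rates hold constant indefinitely."""
    t = year - 2009
    return (cloud_2009 * cloud_rate ** t) / (it_2009 * it_rate ** t)

for year in (2019, 2024, 2030):
    print(f"{year}: cloud is {cloud_share(year):.0%} of IT spending")
```

Under these assumptions the share clears 10% around 2019, sits near a third of the market by 2024, and passes 100% around 2030 – which is of course the point at which the constant-rate assumption must break down.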
What are the implications of this?
Although cloud computing is in its infancy, it is growing very fast and will probably grow faster before slowing. Sometime in the next 5 years CIOs and IT Directors will have to develop a cloud computing strategy. IT companies in the cloud computing space will continue to benefit, and those that are not will suffer.
Source: Reuters
http://www.reuters.com/article/rbssSoftware/idUSN1338463520081013
Tuesday, April 07, 2009
My Book Reading List
I have read five business books recently and have a list of 10 more on my reading list – I am currently reading Groundswell and The Art of Profitability. My goal is to get through the ten books this quarter. Several of these books I selected from CIO Insight's top books for IT leaders. I will write a quick synopsis on each book as I go through them. Reading books is one of the best ways to fuel new ideas and keep you energized – at least it is for me.
Books I read recently:
1. The Goal by Eliyahu M. Goldratt and Jeff Cox
2. What Got You Here Won’t Get You There by Marshall Goldsmith
3. The World Wide Rave by David Meerman Scott
4. The New CIO Leader by Marianne Broadbent and Ellen Kitzis
5. The Next Leap in Productivity by Adam Kolawa
Books I am reading or plan to read
1. Groundswell, Charlene Li and Josh Bernhoff
2. The Art of Profitability by Adrian Slywotzky
3. The Catalyst by Jeanne Liedtka
4. Greater than Yourself by Steve Farber
5. Think Again: Why Good Leaders Make Bad Decisions by Sydney Finkelstein
6. Discovery Driven Growth by Rita Gunther McGrath
7. Billion Dollar Lesson by Paul Carroll and Chunka Mui
8. What would Google Do? by Jeff Jarvis
9. The 100 Best Business Books of All Time by Jack Covert
10. Partnering with Microsoft by Ted Dinsmore and Edward O’Connor
Monday, April 06, 2009
Is Virtualization the Next Killer Application?
I love Wikipedia and here is Wikipedia’s definition of killer application:
A killer application (commonly shortened to killer app), in the jargon of computer programmers and video gamers, has been used to refer to any computer program that is so necessary or desirable that it proves the core value of some larger technology, such as computer hardware like a gaming console, operating system or other software. A killer app can substantially increase sales of the platform that it runs on.
My list of killer applications for businesses in general is:
1. Word processor
2. Spreadsheet
3. Presentation application
4. Email
5. Contacts and calendar on a PDA
6. Mapping and GPS in cars
7. Virtualization
Clearly there are probably some I missed, but this is my list and I stand by it! There ought to be one in there for the internet – maybe the browser – but it’s not really an application, so I will leave the list as it stands.
The challenge with virtualization is that it is a behind-the-scenes application, so it may never be considered a mainstream application. But when you consider Wikipedia’s definition of being "so necessary," I think virtualization meets the requirement.
Virtualization is a game changer and in this slow economy the fundamental cost savings of utilizing computer hardware better is driving organizations to implement virtualization. Once they gain the benefit of better hardware utilization it is not long before they start implementing the second tier benefits of disaster recovery, high availability and manageability.
Here is an article in Tech News World on this shift: “A Strategic View of Virtualization”
http://www.technewsworld.com/story/66724.html
Friday, April 03, 2009
Scrum and Extreme Programming (XP)
Scrum, Extreme Programming (XP) and the whole family of agile development methodologies seem to be the buzz in the software development world. Driving this trend is the business need for dynamic solutions at hyper-fast speeds.
Scrum is a methodology developed over the last 20 years, focused on shorter windows of time between releases – sprints of 2 – 4 weeks. To accomplish this, there has to be essentially constant QA, and the scope must be refined down to just a few items between each sprint, making sure these are the correct features to include in the next release. Clients really like it because they get usable software sooner and more communication throughout the process. I think the main drawback is that some of the overall architecture is going to suffer. It looks like the direction the industry is headed, so if you are involved in software development you probably want to take a closer look.
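The scope-refining step above – trimming the backlog to the few highest-priority items that fit one sprint – can be sketched as a toy example. The story names, priorities, point values, and team capacity below are all made up for illustration:

```python
# Toy sprint-planning sketch: pick backlog items in priority order until
# the team's capacity for a single 2-4 week sprint is used up.
# All names and numbers here are hypothetical.

backlog = [
    {"item": "checkout flow",  "priority": 1, "points": 8},
    {"item": "search filters", "priority": 2, "points": 5},
    {"item": "report export",  "priority": 3, "points": 8},
    {"item": "UI polish",      "priority": 4, "points": 3},
]

capacity = 15  # assumed story points the team can finish in one sprint

sprint, remaining = [], capacity
for story in sorted(backlog, key=lambda s: s["priority"]):
    if story["points"] <= remaining:       # take it only if it still fits
        sprint.append(story["item"])
        remaining -= story["points"]

print("sprint backlog:", sprint)           # everything else waits its turn
```

With these numbers the sprint takes only the top two stories; the rest stay on the product backlog for the next planning cycle, which is exactly the "just a few items per sprint" discipline described above.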
Here is a description of Scrum in Wikipedia and a link for the complete Wikipedia article:
Scrum (development)
From Wikipedia, the free encyclopedia
Scrum is an iterative incremental process of software development commonly used with agile software development. Despite the fact that "Scrum" is not an acronym, some companies implementing the process have been known to adhere to an all capital letter expression of the word, i.e. SCRUM. This may be due to one of Ken Schwaber's early papers capitalizing SCRUM in the title.[1]
Although Scrum was intended for management of software development projects, it can be used to run software maintenance teams, or as a program management approach.
http://en.wikipedia.org/wiki/Scrum_(development)
Thursday, April 02, 2009
Virtualization - What is Holding Some Companies Back?
I have been following virtualization stories over the past weeks, and there is a constant flow of articles about this company or that company virtualizing production servers and the benefits of a virtual environment. Here at Nortec we consolidated from 21 physical servers at our headquarters down to 5 and never looked back. We now have plenty of old servers we can use in labs at each of our offices, and we are very happy with the solution. We have also deployed VDI in some offices, but this has been limited because our workforce is mobile and primarily uses laptops.
I have posted multiple articles this year on the wave of virtualization. Most companies have either already created a virtual environment or plan to do so this year.
One question is: What is holding some companies back from virtualization and are they moving fast enough?
I came across a Forrester report commissioned by Cisco in January 2009. Essentially, the report found that "consolidations is still a strong motivation, but equally important are improving disaster recovery and improving server flexibility." A large percentage (60%–70%) of the companies surveyed are using advanced virtualization capabilities.
A challenge with virtualizing corporate infrastructure is that almost 50% of companies using virtualization have only been working with it for 1–2 years, and fewer than 20% have used it for more than four years. Clearly not everyone has figured out what to do!
Forrester Consulting outlines 4 Steps:
1. Standardize
2. Consolidate
3. Advanced workload management
4. Share resources between application domains with service level guarantees
For the details here is a link to the Forrester report on Cisco’s web site:
http://www.cisco.com/en/US/solutions/collateral/ns340/ns856/ns872/virtualization_C11-521100-0Forrester.pdf