Wednesday, 31 July 2013

Evolving role of an internal auditor

With rapid changes in technology, the role of the traditional audit has also undergone a change. Businesses are now more global in nature, and new technologies have emerged, bringing with them a whole new avenue of risks and compliance requirements that need to be addressed. The role of an internal auditor is no longer restricted to verifying that controls are implemented and processes are adhered to. With process re-engineering being a buzzword, organizations are looking to the auditor to play a more dynamic role and stay in sync with changes to processes and controls as well.

The traditional auditor's responsibilities were restricted to:
  1. Auditing processes against set benchmarks
  2. Auditing processes with respect to any compliance requirements
  3. Reporting non-conformance to top management
As organizations grew, auditors also played an important role in assisting the business in implementing remediation steps to fix gaps highlighted during the audit.

As the nature of business gets more complex and risk increases, modern internal auditors also have more varied responsibilities. Apart from the traditional duties of risk identification and reporting, they need to play an active role in measuring process efficiency. Internal auditors need to assess the effectiveness of the controls implemented and ensure that the controls meet their objectives. They also suggest improvement measures that ensure risks are minimized more effectively.

Apart from internal strategic requirements, the diverse and complex nature of business has also given rise to a whole new set of rules and regulations that organizations need to comply with. The modern internal auditor needs to evolve continuously in order to keep pace with these changing requirements. Auditors need to be aware of the latest compliance requirements relevant to the organization's line of business. They also need to be aware of the latest tools and reporting structures that must be adhered to.
In addition to this, internal auditors also participate in process effectiveness measurement and are expected to highlight areas where process control effectiveness can be improved.

The internal auditor's role has kept evolving as organizational and process complexities have grown exponentially. Beginning from the traditional role of testing controls for IT compliance management, the responsibilities now include keeping pace with current technologies, identifying best practices for adhering to organization-wide and compliance management requirements, and suggesting improvements to existing processes to make compliance more efficient.

About Author:
Harish Mani is a senior consultant and part of the Systems Plus Pvt. Ltd. think tank. He actively contributes to technology and information security. He can be contacted at: harish.m@spluspl.com

Monday, 29 July 2013

VMO because –

ONE SIZE DOESN'T FIT ALL

Each company has unique needs that may differ from those of other companies. This is the reason no two vendors are alike, and attempting to force a relationship with a vendor that contrasts with one's company culture or volume can be like trying to fit a square peg in a round hole.

There is no denying the fact that organizing one's vendors can save a considerable amount of money, but not every vendor or deal gets the same amount of attention from the vendor management office. The aim is to allot resources to the relationships that have a major impact on enterprise strategy.

It's necessary to first know what one is looking for in a vendor. Then weigh the vendors on the most crucial criteria, the ones that fall into the tie-breaker category. Having a clear vision of how the vendor will be selected saves a lot of guesswork and some serious headaches.

Fix a target and then make a list. This gives a clear picture of how much a vendor needs to deliver in a given period. Keep track of all past relationships: which vendors performed extremely well and which disappointed.

Putting a good vendor management system in place is not a simple exercise. It is a continuous process of maintaining communication with vendors; by knowing and communicating one's expectations to the supplier, one can lessen the possibility of misunderstandings and supply problems. With good vendor management, a company puts itself in a position to be successful.


IT SOURCING IS NOT "BUYING TOILET PAPER"

Vendor management is a complex task with a wide range of influence on the business. Maintaining good relationships with existing vendors means on-time and efficient delivery. On the other hand, a poor vendor management system can lead to bottlenecks and other inefficiencies. Vendor management comes down to a lot more than simply haggling for the lowest price. Enterprises that have a formal vendor management group clearly gain both monetary and strategic advantages.

Communicating with vendors is just as important as communicating with customers. Proper communication with vendors builds a win-win relationship, as it leads to increased efficiency, reduced costs, and better customer service; moreover, it builds trust between the two parties.

Gone are the days when organizations focused on price without understanding the underlying technology issues that affect IT's ability to serve business needs. Today, large organizations need vendor management because of their scale.

About Author:
Nisha Tolani is a consultant at Systems Plus Pvt. Ltd. She is part of the consulting team that delivers Sourcing and Vendor Management Office projects. She can be contacted at: nisha.t@spluspl.com

Skills For Successful Project Management

With the re-opening of schools and colleges comes the opening season for various training institutes with lots of skills to sell. While some skills can be taught within the four walls of a classroom, some really can't be imparted in the confines of a training room. Let's try to understand the different skills and their applicability to the profession of project management.

Let's first try to understand what soft skills are. Soft skills are skills that are a common requirement, in different proportions, for any job role or activity we take up in our professional and personal lives. Examples: negotiation skills, communication skills, leadership, influencing skills, interpersonal skills. These are skills we keep learning as we advance in our careers; or shall I say, we can progress in our careers only if we keep developing them. The environment in which we work and live has a huge impact on our ability to pick up these skills. For example, sales people are considered excellent at negotiation, while scientists, technicians, etc. are considered to have excellent analytical skills. Nowhere does this mean sales people can't do analysis or scientists and technicians can't negotiate; it just means these are not their core competencies.

Every industry and profession has its own requirements for soft skills. Let's try to list and relate some bare basic skills required for the prized profession of IT project management, irrespective of organizational structure, project environment, geographical location, etc.


Communication Skills


Don't we do this day in, day out, in both personal and professional life? We speak; we convey messages through SMS, emails, chats, blogs, and body language (which speaks the unspoken words: frowning, wondering, etc.) with parents, spouses, bosses, kids, friends, customers, relatives and, yes, even with strangers. :)

"90% of a project manager's time is spent communicating." (Source: PMI.org)

Are we talking here about the language (English, Spanish, Hindi, Marathi, etc.)? No. While language is a tool for communication, it is also the biggest barrier to it. We are talking here about the skills that make communication effective, be it verbal (written, email, telephone, etc.) or non-verbal (gesture, tone, dress code, etc.).

Some tips for verbal communication.
  1. Learn and gain expertise in the language required for communication.
  2. Understand the medium of Communication
  3. Build vocabulary and grammar
  4. Listen actively; don't just hear
  5. Organize thoughts and rework until they are clear and concise
  6. Form clean, clear, and concise sentences

Leadership


Who doesn't like to have followers, be it the conventional way or the social media way? Everyone likes to get that like and comment on Facebook, Twitter, Blogger or LinkedIn. Be a leader whom people like, whom people follow, by whom people want to be influenced; be the one for whom people will go that extra mile.

“Managing people won’t make you a good leader but leading people will make you a good manager.”

Are we talking about authority over others' actions? Is it about expertise in the respective subject area? No, it's about motivating and influencing others to get the work done, utilising their expertise and interest. You need not be an expert in all areas, but neither should you be shy to ask for help from others. You may not always have authority over all the stakeholders involved in a project, but you can build a reputation that makes people start following you.

Some traits found in Good Leaders.
  1. Maintain highest levels of Integrity and Ethics
  2. Relate to people and their concerns
  3. Practice Before Preach
  4. Set Right Examples
  5. Communicate

Negotiation Skills


Again, something that I am sure most of us enjoy doing, both professionally and personally. Don't we do this while buying groceries, or while settling disputes with friends, strangers and even enemies? (Practice with caution with your wife; chances are you may lose, as they are experts in negotiation using any and all tools and tricks.)


But why do we negotiate?
  1. To make a gain
  2. To be superior
  3. To win over others
  4. Out Of Fear
  5. Out of situational demands.
How do we negotiate? (Reminds me of the great philosopher Chanakya)
  1. Speak / Explain / Discuss
  2. Buy in / Pay Surplus / extra benefits
  3. Reprimand / Horrify / Intimidate / Insult
  4. Attack / Forceful assertion / Dictatorship
What should be outcome of Negotiation?
  1. Win-win Situation for both / all parties involved in negotiation. (Not possible always)
  2. Mutual Understanding. (Comes with the way negotiation was handled)
  3. Gain for which negotiation was carried out. (may not always be to what was planned)
Some Tools and Techniques Useful for effective negotiation
  1. Good Cop - Bad Cop
  2. Snow Ball
  3. Flinch
  4. Bogey
  5. The Nibble
  6. Deadline
  7. And Many More…
Research and studies carried out globally conclude that fewer than 35% of projects succeed (Source: http://pmi.org). What's more surprising is that "80% of IT project challenges are caused by people challenges" (Source: https://www.pmi.org). Further research suggests the issues highlighted below are major factors affecting project success rates.
  1. Inadequately trained or inexperienced project managers
  2. Failure to set and manage expectations
  3. Poor leadership at any and all levels
  4. Cultural and ethical misalignment
  5. Misalignment between project team and business/organization it serves
  6. Inadequate communication, incl. progress tracking and reporting
This is a clear indicator to invest in training and grooming project team members.

I hope this blog helps you realize the value that the right soft skills can bring to projects and project organisations. I look forward to your views and opinions, both agreements and disagreements.


About Author:
Pradip Sadare is an Operations Manager with Systems Plus Pvt. Ltd. He enjoys relating management to everyday life and works in the Managed Captive Model (click here to know more about our managed captive offerings). He can be contacted at: pradip.s@spluspl.com

Friday, 26 July 2013

Early Contract Renewal

Renegotiation / renewal of contracts is considered a win-win scenario for both client and vendor. In today's outsourcing environment, it is not a choice but a necessity. In long-term (three years or more) relationships, the client's business and technology needs keep changing. It is therefore advisable that both sides be flexible enough to modify the agreement over the life of the contract.

Clients view renegotiation / renewal of existing contracts as a highly effective strategy to revise and align scope and pricing and to build a mutually beneficial relationship with the vendor.


CLIENTS CONSIDER THE FOLLOWING BEFORE RENEGOTIATING / RENEWING A CONTRACT

  1. Pricing model
    • Discrepancy between current market pricing and contract rates
  2. Operating model
    • How effective is the current delivery model (near shore, offshore, onshore)
    • Rate of delivery failure
  3. Scope of service
    • Meets existing business needs
    • Ability to meet future / revised business needs
  4. Quality of service and risk
    • Review quality of services / deliverables as per SLA
    • Identify risks not covered in SLA
  5. Contract terms
    • Flexibility provided as per terms mentioned in contract
    • List of terms that are no longer relevant
    • Incorporating new terms based on lessons learned / industry good practice
  6. Will insourcing or switching providers be trading one set of problems for another?

KEY REASONS TO RENEGOTIATE / RENEW CONTRACTS

  1. Expand / change in scope because of new offering / needs
  2. Change / Clarity in pricing
    • Fluctuating market conditions that may result in a client paying prices that are higher than current market rates
    • Move from time and materials to a fixed-price model or vice versa
    • Move to a pricing model that better represents a long-term relationship
    • Adjust pricing because of added scope
  3. Realign both parties' interests and strengthen the relationship
  4. Technological innovations (such as grid and cloud computing) that may affect the effectiveness of an outsourcing agreement
  5. Clarify contract terms
    • Regulatory changes (such as new privacy and security policies)
    • Clarify some terms, considering the relationship grew deeper and more collaborative than originally envisioned
    • Restructure contract to allow for continuous growth without having to renegotiate the contract every time
    • Establish new billing metrics regarding what constitutes an added resource cost (ARC)
  6. Unsatisfactory service levels and quality issues

QUALITIES IN A VENDOR THAT TRIGGER EARLY RENEWAL

  1. Flexibility
  2. Feeling of partnership and One team
  3. Honesty / Integrity
  4. Customer focus
  5. Overall performance

CLIENT ADVANTAGES: RENEGOTIATING / RENEWING

  1. Delivery aligned with business needs
  2. Avoid new vendor selection process
  3. No transition disruptions
  4. Delivery challenges are addressed
  5. Align cost with the market price
  6. Liberty to revise / add new terms in the contract

CLIENT RISKS: MIGRATING TO NEW VENDOR

  1. Service disruption
  2. Transition costs
  3. Loss of knowledgeable resources
  4. Complexity in managing multiple providers

About Author:
Tina Nebhnani is a consultant at Systems Plus Pvt. Ltd. She is part of the consulting team that delivers Sourcing and Vendor Management Office projects. She can be contacted at: tina.n@spluspl.com

Tuesday, 23 July 2013

ASP.NET Page Life Cycle

In this article I'll talk about the ASP.NET page life cycle. We will try to see which events are important for an ASP.NET developer and what can be achieved in them.
As an ASP.NET developer, it is essential to understand the ASP.NET application life cycle and page life cycle. With the ease of development provided by Visual Studio, new programmers sometimes start writing ASP.NET pages without understanding either.

From an end user's perspective, a request for a web page is made to the web server and the web server returns the page to the user. So simple, isn't it? :)

However, for slightly more technical users, we can also state that the web server receives the request, performs some server-side activities like reading from a database and processing the data received, and returns the output to the user.
As an ASP.NET developer, one needs to understand how this request is processed (the application life cycle) and how the web page is processed and served to the user (the page life cycle).

When an ASP.NET page runs, it goes through a life cycle in which it performs a series of processing steps: initialization, instantiating controls, restoring and maintaining state, running event handler code, and rendering.
It is important to understand the page life cycle so that you can write better code, at the appropriate stage of the life cycle for the effect you intend.

Let’s see Page life cycle stages

  • Page request: The page request occurs before the page life cycle begins. When the page is requested by a user, ASP.NET determines whether the page needs to be parsed and compiled (therefore beginning the life of a page), or whether a cached version of the page can be sent in response without running the page.
  • Start: In the start stage, page properties such as Request and Response are set. At this stage, the page also determines whether the request is a postback or a new request and sets the IsPostBack property. The page also sets the UICulture property.
  • Initialization: During page initialization, controls on the page are available and each control's UniqueID property is set. A master page and themes are also applied to the page if applicable. If the current request is a postback, the postback data has not yet been loaded and control property values have not been restored to the values from view state.
  • Load: During load, if the current request is a postback, control properties are loaded with information recovered from view state and control state.
  • Postback event handling: If the request is a postback, control event handlers are called. After that, the Validate method of all validator controls is called, which sets the IsValid property of individual validator controls and of the page. (There is an exception to this sequence: the handler for the event that caused validation is called after validation.)
  • Rendering: Before rendering, view state is saved for the page and all controls. During the rendering stage, the page calls the Render method for each control, providing a text writer that writes its output to the OutputStream object of the page's Response property.
  • Unload: The Unload event is raised after the page has been fully rendered, sent to the client, and is ready to be discarded. At this point, page properties such as Response and Request are unloaded and cleanup is performed.
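As a quick illustration (my own sketch, not from the original article), the load stage is where code-behind typically distinguishes a first request from a postback via the IsPostBack property. The page class name DemoPage and the Label control lblMessage are hypothetical; this is a code-behind fragment, so it needs the ASP.NET runtime and a matching .aspx page to execute.

```
using System;
using System.Web.UI;

public partial class DemoPage : Page
{
    // Runs during the Load stage of the page life cycle.
    protected void Page_Load(object sender, EventArgs e)
    {
        if (!IsPostBack)
        {
            // First request: one-time setup such as data binding belongs here.
            lblMessage.Text = "First visit";
        }
        else
        {
            // Postback: control values have already been restored from view state.
            lblMessage.Text = "Posted back";
        }
    }
}
```

Guarding the setup with IsPostBack avoids redoing work (and overwriting user input) every time the page posts back.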

As I said earlier, it is important for an ASP.NET developer to understand these events and what can be achieved in them; similarly, we also need to understand what is available to us in each event or stage of the life cycle.
For example, consider in which events or stages of the life cycle ViewState is available.



We can find a list of all the events available in the page life cycle in the Microsoft MSDN Library.

When we use server controls on an ASP.NET page, we do not need to worry about each individual control's life cycle. Yes, each individual server control has its own life cycle, which is similar to the page life cycle.
For example, each control's Init and Load events occur during the corresponding page events.

For a diagram showing the flow of the page life cycle, and for more detailed information, click here (reference: Microsoft MSDN Library).





I hope that the next time you write your code, you will keep the page life cycle in mind.

About Author:
Harshad Pednekar is a budding technology geek who actively contributes to Systems Plus with his creativity and research on technology. To read more interesting articles from him, please follow: http://harshadpednekar.blogspot.in

Sunday, 21 July 2013

Cloud computing to take over IT

Cloud computing is the new-age technology that has emerged. This technology is not only user-friendly but also a big time saver. Its biggest advantage is the availability of, and access to, one's data anywhere and anytime. The technology is in wide use, especially in the corporate world.

According to one analysis, IT departments in the corporate world will shrink by 75% over the next decade. The reason behind this diminishing size is the increasing use of cloud technology, which has also proved to be quite economical. IT departments would thus take on the role of advising rather than developing software. This will result in smaller IT teams focused on making IT easier to use.


Cloud computing taking over the IT department would not only affect the work flow of the IT department but would also impact the other departments of the firm. These impacts can be listed as follows:
  • Helps CIOs increase their roles in the firm: The changes would help CIOs increase their roles and use their management skills in all the other departments, like HR, supply chain, etc.
  • Changes in other IT roles: Around 80% of the people in the IT department would experience significant changes in their roles and responsibilities, and would be expected to improve their skills as per changing business requirements. Traditional IT roles like developer, datacentre admin or network admin will no longer exist in the IT work flow; they would be replaced by cloud services, such as suppliers of the software.
  • IT department will act as an advisory board: With changing business requirements, the IT department needs professionals with good negotiation, sales, financial and contract management skills, which can be used to deal with suppliers. They thus help the other departments buy the IT systems they need.
  • Emergence of new IT roles: The emerging cloud technology has replaced a lot of traditional roles. Organizations need more service managers in their IT teams, who will help the firm obtain the right systems according to the requirements of each department.
  • Retraining for the IT department: The firm will help employees develop their collaborative skills so that they can survive in the new environment. CIOs suggest deciding on a long-term workforce plan that would help the organization recruit employees with the required management and collaborative skills, and also suggest that existing employees be trained.
Thus, with an investment in cloud technology and a few other changes to the present system as mentioned above, an organization with a handful of trained professionals and a small IT department can achieve huge profits.


About Author:
Mihir Sakhle is a consultant at Systems Plus Pvt. Ltd. He is part of the consulting team that delivers Sourcing and Vendor Management Office projects. He can be contacted at: mihir.s@spluspl.com

Thursday, 18 July 2013

Business Analysis – Is it a "domain-specific" profession?

The classic role of a business analyst is much the same irrespective of the industry he works in. He bridges the gap between business and technology, i.e. he translates business processes into functional requirement specifications that developers can understand.

A business analyst's responsibilities are divided into different sub-tasks depending on the project phase. In the requirements phase, the BA has to gather data regarding the client's pain points and understand the client's/business users' expectations of the technology. The scope and the high-level business requirements are captured at this stage. There are several requirement-gathering techniques, like Joint Application Development (JAD) sessions, surveys, interviews, etc. JAD is a methodology wherein the client/end user takes active part in the design and development phases.

Next, the design phase involves translating the high-level requirements into detailed functional and non-functional requirements. In this phase the test lead comes up with test cases for the identified scenarios and the different tests to be performed. These test cases have to be reviewed and approved by the BA. Traceability matrices can also be used by the BA to map the business and functional requirements. A BA is responsible for managing the change requests that come up, mainly during the development and testing phases of the project. And in the testing phase, he has to approve the test results, conduct BA testing and support UAT.

The above activities are inherent in a BA's profile no matter which domain he works in. The skills related to these activities are also transferable when a BA moves from one domain to another. In fact, transferable skills are the key to demonstrating the competence of an aspiring business analyst. A good BA should be able to effectively utilize requirement elicitation techniques, soft skills and analytical skills, and should focus on the process rather than the deliverables.

There are several personality traits that determine how good a BA is. The ability to negotiate prudently and perseverance are the two traits tested most in a BA. The biggest challenge a business analyst faces is interviewing a business user, who would be his most important client; the BA needs to convince them to give their time and effort to actually explaining their business process. A business analyst needs good comprehension so that he can grasp business processes quickly. This helps him ask questions sooner and thereby nail the client's pain points earlier. IT skills also help a BA act as an ideal liaison between the business and technical departments. Inquisitiveness, strategic thinking and an eye for detail are other important characteristics a business analyst requires. As far as domain knowledge is concerned, a few basic nitty-gritties always help a BA ease his initial days in a new domain. However, as we saw, a good business analyst can pick up a domain with a little effort; beyond that, his skills are the biggest factor that takes a business analyst a long way. A good business analyst can always work his way into any domain with a small learning curve.

In the end, I would say an adventurous business analyst would focus on working in more than one domain, which gives him abundant exposure to several businesses. A good business analyst is better judged by his analytical skills, his soft skills (inquisitiveness, comprehension, etc.), his communication skills and his personality traits (perseverance, pleasing behaviour, etc.) than by just the domain in which he has worked or his years of experience in a particular domain.

About Author:
Onkar Lalla is a consultant and an important part of the Systems Plus Pvt. Ltd. think tank. Within Systems Plus, he actively contributes to the areas of technology and information security. He can be contacted at onkar.l@spluspl.com

Tuesday, 16 July 2013

GENERICS


This feature was introduced in C# 2.0.

What is a Collection?

A collection, sometimes called a container, is an object that groups multiple elements into a single unit.

So why use collections?

Collections reduce programming effort, as they provide you with useful data structures and algorithms.

Consider the example of a stack and its two methods, Push and Pop. In C# 1.1, .NET provided a stack of type object; since object is the base class in C#, you were able to put anything into the stack. See the example below:

public class Stack
    {
        object[] Items;
        public void Push(object item)
         {...}
        public object Pop()
         {...}
    }

Stack stack = new Stack();
stack.Push("One");
stack.Push("Two");
string number = (string)stack.Pop();

The problem here was that, in the case of value types, you needed to box them to the object type and then unbox them again. Boxing and unboxing had a performance cost, but that was not the only issue.

Consider the below example:

Stack stack = new Stack();
stack.Push("One");
stack.Push("Two");
int number = (int)stack.Pop(); // compiles, but throws an InvalidCastException at runtime

To overcome this problem, one solution was to create type-specific stacks, for example an IntStack to store ints, a StringStack to store strings, and so on. But the problem then was that if I had to change the algorithm, or if there was an error in it, I had to make the change in all the class files, which would be tedious and repetitive work. So generics were introduced.

Generics allow you to define reusable classes and methods without compromising type safety, performance, or productivity. For example, consider the code below:

  public class Stack<T>
    {
       T[] Items; 
       public void Push(T item)
       {...}
       public T Pop()
       {...}
    }

Stack<int> stack = new Stack<int>();
stack.Push(1);
stack.Push(2);
int number = stack.Pop();

So now, when you create a stack, you need to specify the type both when declaring the variable and when instantiating it:
Stack<int> stack = new Stack<int>();

So in the above example, the type parameter T is replaced with the specific data type. You can also reuse the code for any other type, and if you need to change the algorithm, you now change it in just one place.

But generics still have a constraint: because the type parameter can be any type, the compiler rejects operations it cannot guarantee that type supports. See the example below:

public T Find(T item)
{
    foreach (T current in Items)
    {
        if (current == item) // will not compile: '==' is not defined for an unconstrained T
        {
            return current;
        }
    }
    return default(T);
}

This is because the compiler does not know whether the type specified by the client supports comparison. To solve this problem, you constrain the type parameter, like this:

public class Stack<T> where T : IComparable
{
    T[] Items;
    public void Push(T item)
    {...}
    public T Pop()
    {...}

    public T Find(T item)
    {
        foreach (T current in Items)
        {
            if (current.CompareTo(item) == 0) // compiles: T is guaranteed to implement IComparable
            {
                return current;
            }
        }
        return default(T);
    }
}

Just as a class can be generic, you can also make a method generic. Consider the example below:

public void MyMethod<X>(X x)
   {...}
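To tie the pieces together, here is a small self-contained sketch of my own (the class name SimpleStack, its fixed capacity, and the PrintTwice method are illustrative assumptions, not from the original article) showing a generic stack and a generic method in use:

```csharp
using System;

// Illustrative generic stack (assumed name; fixed capacity kept small for brevity).
public class SimpleStack<T>
{
    private T[] items = new T[16];
    private int count;

    public void Push(T item) { items[count++] = item; }
    public T Pop() { return items[--count]; }
}

public static class Demo
{
    // A generic method: X is inferred from the argument, or given explicitly.
    public static void PrintTwice<X>(X x)
    {
        Console.WriteLine(x);
        Console.WriteLine(x);
    }

    public static void Main()
    {
        SimpleStack<int> stack = new SimpleStack<int>();
        stack.Push(1);
        stack.Push(2);
        int number = stack.Pop(); // no cast and no boxing: number is 2

        PrintTwice(number);       // type argument inferred as int
        PrintTwice<string>("hi"); // or specified explicitly
    }
}
```

Compared with the object-based stack from C# 1.1, there is no cast at the call site and no boxing of the int values, which is exactly the type-safety and performance benefit described above.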

About Author:
Steven Pinto is a technology geek and loves to write about technology. He works at Systems Plus Pvt. Ltd. and actively contributes to technology.
For more interesting topics written by Steven, follow http://mad4teck.blogspot.in/

Friday, 12 July 2013

What is Near Shore

Many companies meet their operational needs by outsourcing their contact center operations to onshore, offshore, or near shore locations. The decision on which type of outsourcing to choose is based on cost comparisons, proximity to the business location, and language or cultural considerations. It is helpful to begin by defining the terms "onshore", "offshore" and "near shore".

Onshore Outsourcing: Outsourcing operations of the company to another company located in the home country or region. Companies can reduce labor costs somewhat and benefit from highly skilled labor with little or no language or cultural barrier, but the cost of such operations is high compared to offshore or near shore locations.

Offshore Outsourcing: Outsourcing the operations of the company to other companies that are located in a foreign country, and most likely have a different language and culture. Offshore outsourcing offers benefits like higher cost savings and access to highly skilled labor.

Near Shore Outsourcing: Outsourcing the operations of the company to a company in an adjacent or nearby country with a similar culture and language skills. Near shore outsourcing offers some cost savings over onshore and has the added benefit of proximity for more frequent site visits, while retaining a highly skilled labor pool.

Near shore outsourcing is the practice of getting work done or services performed by people in neighboring countries rather than in your own country. Many companies in the United States outsource work to Canada and Mexico because those countries are geographically close and can supply the required resources. Travel and communication are easier and less expensive when the distance is smaller, there is likely to be at least some similarity between the cultures, and people are more likely to speak the same language.

Near shoring offers reduced cost and reduced risk compared to projects given to companies in distant foreign countries, i.e. offshore outsourcing. Many developed countries prefer to place their outsourcing projects with organizations in neighboring countries because they find it easier and less expensive: cost-effective structures are available nearby, without having to depend on distant offshore countries to get the work done.

Near shore outsourcing has many benefits compared to offshore outsourcing. Because nearby countries do not differ much in time zone, providers can deliver business process outsourcing services in the same time zone as their customers. Proximity also helps avoid problems arising from differences in language, culture, legal affairs, infrastructure and technology. These benefits represent the most elemental form of the near shore value proposition, since they are available to virtually every company that establishes itself in a near shore location, and a seasoned global sourcing professional can justify them easily. It is therefore understandable that many players are jumping onto the near shore bandwagon and claiming the near shore value proposition.

About Author:
Shailesh Nambiar is a consultant and part of Systems Plus Pvt. Ltd. He actively contributes to information technology and VMO. He can be contacted at: shailesh.n@spluspl.com

Thursday, 11 July 2013

Analyzing Application Performance in Visual Studio

Performance tools in Visual Studio Team Edition allow developers to measure, evaluate and target performance-related issues in their code.

These tools are fully integrated into the integrated development environment (IDE) to provide a seamless and approachable user experience. The performance tools support two methods of profiling: Sampling and Instrumentation.


The process of profiling an application is straightforward. You can begin by creating a new performance session. 
In Microsoft Visual Studio Team Edition for Software Developers, you can use the Performance Session Wizard to create a new performance session. After a performance session ends, the data gathered during profiling is saved in a .vsp file, which you can view inside the IDE. There are six views available to help detect performance issues in the gathered data. The performance tools can also be used from the command line, which gives you the flexibility to run them directly or to automate profiling tasks with scripts.
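As a rough sketch of the command-line workflow (assuming the Visual Studio performance tools are on the PATH; the output file name and target executable below are placeholders, not from the article):

```shell
# Start the sampling profiler and name the .vsp output file
VSPerfCmd /start:sample /output:MyReport.vsp

# Launch the application to be profiled under the profiler
VSPerfCmd /launch:MyApp.exe

# ... exercise the application, then shut the profiler down
VSPerfCmd /shutdown

# Optionally turn the .vsp data into text summary reports
VSPerfReport /summary:all MyReport.vsp
```

This makes it straightforward to fold profiling into a build or test script rather than driving it through the IDE.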

Performance Session

When you use the Performance Profiler, you create a Performance Session, which contains the configuration data for collecting performance information and the results of one or more profiling runs. After you create a performance session it appears in the Performance Explorer window.
  1. The name of the profiling session.
  2. The Targets folder shows the projects or binaries that are profiled in the session.
  3. The Reports folder contains the profiling data files from one or more collection runs. You can click a file name and select views of the performance information, such as function calls, memory allocations, and details of specific functions. Each view is displayed in the main Visual Studio window.

Sampling Method

Sampling is a statistical profiling method that shows you the functions that are doing most of the user mode work in the application. Sampling is a good place to start to look for areas to speed up your application.
At specified intervals, the Sampling method collects information about the functions that are executing in your application. After you finish a profiling run, the Summary view of the profiling data appears in the main Visual Studio window. The Summary view shows the most active function call tree, called the Hot Path, where most of the work in the application was performed. This view also lists the functions that performed the most individual work, and provides a timeline graph you can use to focus on specific segments of the sampling session.

Prerequisites for profiling

Here are a few things that you can do before you start profiling to make sure that you do not encounter unnecessary problems.

a. Run as administrator

If you are not an administrator on the computer that you are using, you should run Visual Studio as an administrator to make sure that you have the permissions that are necessary for some of the features in the profiling tools. To do this, click the Start button, locate the Visual Studio application icon, right-click the icon, and then click Run as administrator.

b. Set the active build configuration to Release

Debug builds insert additional diagnostic code into your application and do not include optimizations that the compiler performs in release builds. Profiling the release version of your application provides more accurate data about the performance of your application. To change the active configuration, on the Build menu click Configuration Manager and in the dialog box, under Active solution configurations, select Release.

c. Get Windows symbols files

If you profile code that calls Windows functions, you should make sure that you have the most current .pdb files. Without these files your report views will list Windows function names that are cryptic and difficult to understand.

Create Performance Session for ASP.NET Applications

You can use the Performance Wizard to create a performance session for an ASP.NET application. A performance session can be created with or without opening a project in Visual Studio.

To create a performance session for ASP.NET application

  1. Open the ASP.NET Web project in Visual Studio.
  2. On the Tools menu, point to Performance Tools, and then click Performance Wizard.
  3. In the Which of the following available targets would you like to profile? drop-down list, make sure the current project is selected, and then click Next.
  4. Choose Sampling or Instrumentation to specify a profiling method, and then click Next.
  5. Click Finish.
A performance session is created for the ASP.NET application.

To create a performance session for ASP.NET application manually

  1. On the Tools menu, point to Performance Tools, and then click Performance Wizard.
  2. From the Which of the following available targets would you like to profile? drop-down list, select Profile an ASP.NET application, and then click Next.
  3. In the What local URL or Path will run your web application box, enter the URL, and then click Next.
    Important: 
    For a server (IIS) based Web site, enter a URL such as http://localhost/MySite/default.aspx. This causes the ASP.NET application on the local computer at the application root of MySite to be profiled, and the page default.aspx on that site to be launched in Internet Explorer to start the session.
    For a file based Web site, enter a path such as c:\WebSites\MySite\default.aspx. This causes the ASP.NET application located at c:\WebSites\MySite to be profiled and the page http://localhost:nnnn/MySite/default.aspx to be launched in Internet Explorer to start the session.
  4. Choose Sampling or Instrumentation to specify a profiling method, and then click Next.
  5. Click Finish.

Profile a Web Site or Web Application Using the Performance Wizard

You can use the Performance Wizard to collect performance data for an ASP.NET Web application. You can profile a Web application that is open in Visual Studio, or you can profile an ASP.NET Web site that is located on your local computer and not open in the Visual Studio IDE.

Depending on User Access Permissions settings that an administrator has made available, an individual user might or might not have security permission to create a profiler session on the computer that hosts the ASP.NET process. The following examples illustrate possible differences among users:
  • Some users may access advanced profiling features when the administrator has set the driver and service to start.
  • Domain users may access sample profiling only.
  • Some users may deny all other users access to profiling.

To profile a Web site project

  1. Open the ASP.NET Web project in Visual Studio Premium or Visual Studio Ultimate.
  2. On the Analyze menu, click Launch Performance Wizard.
  3. On the first page of the wizard, select a profiling method, and then click Next. Note that the concurrency visualizer profiling method is not available for web applications.
  4. In the Which application would you like to target for profiling? drop-down list, make sure that the current project is selected, and then click Next.
  5. On the third page of the wizard, you can choose to add tier interaction profiling (TIP) data, data from the JavaScript running in the Web pages, or both.
    • To collect tier interaction, select the Enable Tier Interaction Profiling check box.
    • To collect data from the JavaScript running in the Web pages, select the Profile JavaScript check box.
  6. On the fourth page of the wizard, click Finish.
  7. A performance session is created for the ASP.NET application, and the Web site is started in the browser. Exercise the functionality that you want to profile, and then close the browser.

To profile a Web site without opening a project in Visual Studio

  1. Open Visual Studio Premium or Visual Studio Ultimate.
  2. On the Analyze menu, click Launch Performance Wizard.
  3. On the first page of the wizard, select a profiling method, and then click Next.
  4. On the second page of the wizard, select the Profile an ASP.NET or JavaScript application option, and then click Next.
  5. In the What URL or Path will run your web application box on the third page of the wizard, enter the URL to the application home page, and then click Next.
    1. For a server (IIS) based Web site, type a URL such as http://localhost/MySite/default.aspx. This causes the ASP.NET application on the local computer at the application root of MySite to be profiled, and the page default.aspx on that site to be started in Internet Explorer to start the session.
    2. For a file based Web site, type a path such as c:\WebSites\MySite\default.aspx. This causes the ASP.NET application located at c:\WebSites\MySite to be profiled and the page http://localhost:nnnn/MySite/default.aspx to be started in Internet Explorer to start the session.
    3. For external sites that you wish to collect JavaScript data on, type the URL, for example http://www.contoso.com.
  6. On the third page of the wizard, you can choose to add tier interaction profiling (TIP) data, data from the JavaScript running in the Web pages, or both.
    1. To collect tier interaction, select the Enable Tier Interaction Profiling check box.
    2. To collect data from the JavaScript running in the Web pages, select the Profile JavaScript check box.
  7. Click Next.
  8. On the fourth page of the wizard, click Finish.
  9. A performance session is created for the ASP.NET application, and the Web site is started in the browser. Exercise the functionality that you want to profile, and then close the browser.
The profiler generates the data file and displays the Summary view of the data in the Visual Studio main window.

About Author:
Dhairut Dholakia is a technology lover and an important part of the Systems Plus technology Think Tank. He works at Systems Plus Pvt. Ltd. and actively contributes to technology. He can be contacted at: dhairut.d@spluspl.com