Customised Operations for Customised Research at CASRO
  • Question…do we need to introduce ourselves, or will they be providing an introduction for each presentation? Might need to add this in. Sameer>> I think they should, but we can introduce ourselves and our company name in case they don't. This morning Sameer and I would like to talk about an ah-ha discovery we made on some recent projects…a moment that brought to light the need to take a different approach to how Operations has traditionally supported market research projects.
  • There is a lot of information to cover in the hour…this screen provides a basic outline of how we will be proceeding.
  • I believe we can all agree with what is portrayed on this screen. Over the past few years, market research projects have become more challenging overall. Clients are demanding more and more, AND they want it quicker and at a lower cost. Many of them are expecting complete transparency as well. For example, not just receiving the data collected from a phone interview, but an audio file of each interview so they can ‘hear’ the respondent's answers, voice inflection, interviewer quality, etc. In our experience, the technology that supports MR operations is still pretty standard. Meaning, most market research companies are still approaching study execution in the same manner…utilizing the same technology. The tools have not necessarily kept up with the demands of the clients. Sameer>> In contrast, we can see how other fields like Business Intelligence have advanced so much technologically over the years.
  • The more challenging projects of today have certain characteristics… * High volume * Unique requirements, such as recording phone interviews and NOT having interviewers type open-end responses…then having respondent comments transcribed, or extremely quick turnaround on respondent data that meets specific requirements, etc. * Not just data as a deliverable…but also an audio file, transcribed comments, results in tables and posted to a portal, etc. * Programs which run for an extended period of time…1, 2, or 3 years, with deliverables required on a consistent basis…daily is becoming more prevalent. Supporting these projects has introduced new challenges…
  • Some of the biggest challenges are… How do we meet the requirements of the project yet stay cost competitive, particularly with ever tighter turnaround times and a growing number of deliverables? How do we make ourselves more attractive than our competitors? Why should they choose us? And of course most importantly, how do we do all this AND keep our margins? Sameer>> I am sure you will agree that in today's economic scenario, these challenges have intensified.
  • The traditional approach is to rely on our tried and true processes and technology…the ‘old reliables’ as it were. Makes sense…they are comfortable…well traveled…they have a proven track record, and people are comfortable handling them. However…
  • However, this approach often leads to overruns and the need for additional resources, particularly for these more challenging projects. For these projects, what we need is a ‘Customized Approach’…a solution best suited to support the project requirements at hand.
  • This leads us to the purpose of our presentation…
  • We would like to propose a conceptual model for customized operations…a model that will help us: Identify whether a project requires customized operations…are there deliverables which current operations cannot support, or cannot support in a time/cost efficient manner? Align operational processes and technology with the project requirements and maximize time and cost efficiencies. Help with resource planning, identifying KPIs and metrics for tracking, and making sure they are aligned with long-term organizational goals. And, best of all, the model will aid in determining the total cost of operations for the different approaches and identify the best approach overall. Will our tried and true methods fit the bill and keep us cost efficient? Or do we need to modify the tried and true to meet the project requirements? Or will modification allow us to realize a higher margin?
  • We’d like to share with you one recent experience we had the privilege to be part of…one we hope will shed light on what we’ve been talking about up to this point…
  • A case study if you will
  • Synovate recently took on a customer satisfaction follow-up program for a car company. The scope of the project included… Taking place over an extended period of time…3 years. Extremely high volume for a phone project, with around 5,000 interviews completed EACH day, Monday through Saturday. 100% of the interviews were to be recorded…but respondent comments were not to be typed in by the interviewer…a qualitative approach within a quantitative methodology. All respondent open-end comments were to be transcribed after the interview was completed…then the individual comments coded to allow for quantitative analysis. Turnaround time was extremely tight… For any interviews flagged as ‘Hot’, an email containing the structured data from the interview, transcribed comments, and coded data would be sent to the client within 30 minutes. A ‘Hot’ flag indicates that the respondent had an important issue during their interaction with the client…and wants contact from that client within 48 hours. All remaining interviews were to be completed, transcribed, coded, and ready for upload to the client website by 6am the following morning. To shed some light on this requirement…interviews are conducted between 4:30 and 10:30 CST. All these deliverables make Operations 100% transparent. Every aspect of the project is visible!
  • Honestly, this project introduced just about every challenge we identified earlier… As I just mentioned…very tight turnaround. The logistical challenge of file transfers across multiple locations at multiple times, and then consolidation at the end. Very stringent quality requirements…95% per complete. How to schedule and plan resources given the very small execution window. And, finding an outside vendor to support the transcription and coding piece…we quickly realized that it would not be cost efficient to support these pieces internally given the timing constraints. This is the point where I met Sameer…as Datamatics was our partner of choice in this venture. So, how do we approach this? How do we meet all these challenges in the most efficient manner?
  • First, we took a look at our processes and technology as they stand. It is important to point out that our processes are designed around supporting custom market research projects…which means that the scope and requirements change from project to project, wave to wave. Let’s take a look at what the typical approach to this project might look like…This is a very basic diagram of how the process, or data, would flow using existing processes and technology. Let’s discuss each piece individually…
  • First, interviewing begins at 4pm. As interviewers complete a call, an audio file in .wav format is automatically created and saved on the CATI dialer. Audio files are then classified by type of interview (Hot/non-Hot) and language, and only then are they ready to send to partners for transcription and coding.
  • The resulting audio files would need to be batched and transferred to an internal server which our transcription/coding partners could access. Typically, this process would take place every 30 minutes, via FTP. The one exception would be hot alerts, which must be delivered to the client within 30 minutes…transfer of these files would need to happen immediately. Sameer>> It is important for Synovate and the vendors to be completely in sync to ensure successful transfer and processing.
  • Next, our partners need to download the files so that transcribing and coding can take place. The files would first be transcribed and quality checked by one team, then sent to the coding team. Next, the transcription and coding output would need to be combined into one file and batched for uploading to Synovate servers at regular intervals…much like our downloads to the vendors.
  • Finally, once the transcription and coding data was received, it would need to be consolidated with the structured interview data and audio file at an individual respondent level, and uploaded to a website for client access and analysis. All of this would need to be completed within 30 minutes for Hot calls, and by 6am for all remaining calls.
  • Well, that is what we would ideally have done. However, we chose a different path.
  • We needed to ensure that, along with achieving the strict turnaround time and quality standards, we could run this project profitably over a period of three years.
  • So we put our heads together and started brainstorming!!
  • We started by developing a thorough understanding of the project requirements and then designing an efficient process to meet them. Once we had an efficient process in place, we evaluated different technologies and found the most appropriate technology that fit the process and gave us increased margins by bringing down the total cost of operations.
  • So after brainstorming across two continents, we came up with this… Having this process in place was half the work done, as we had a clear understanding of ‘What’ we needed to do. The other part was ‘How’ we were going to automate. The key areas highlighted after this exercise were: 1. The file naming convention – the dialer assigns the automatically generated .wav files a default name which is meaningless in terms of an individual project. However, for this project the file name needed to drive where the audio file was routed, so modification to the filename was required. 2. Removing files from the dialer – the audio files have to be removed from the dialer at the end of each day, or else the performance of the dialer is negatively impacted. The standard process is for a person to manually batch the audios and move them to a predetermined drive/server for access; the number of projects requiring audio files had not been high enough to justify an automated process. However, considering that this project was going to run six days a week for the next three years, relying on the manual method would have been inefficient. This had to be automated. 3. Sending files to partners – considering the tight turnaround time, if we batched and sent files every 30 minutes it would be impossible to meet the client requirement for Hot calls, as well as complete transcription. As a reminder, for Hot calls, an email containing all the deliverables (structured data, transcribed comments, and coded data) has to be sent to the client WITHIN 30 MINUTES! It would also not be efficient for our partners to receive a huge batch of files in one go and then work their way through it…it would cause a bottleneck when downloading files. To further ensure there was no bottleneck in the transfer of files, we decided to convert the .wav files to mp3, which on this project ended up being 45% smaller in size. Also, transferring the files to an internally owned server would put the ownership of reconciling the audio with the associated transcription/coding results in Synovate’s hands. We wanted the partners to be responsible for providing, for each audio file received, the associated transcription/coding results. Hence it was decided to post files to the partners’ FTP servers. 4. Consolidation – we needed to ensure that the files sent to partners were combined with the output sent back by the partners and uploaded within a specific window to meet client requirements. This would be a tedious process and therefore needed to be automated.
  • Now we can look at ‘How’ we achieved what we specced out in our process flow… We will look at each element individually, as it involved automation in different areas at Synovate as well as at the partners. The very first key area was identifying the different types of file and routing them to the appropriate location. This was solved by having the filenames contain key information to drive routing…whether the file was a hot alert, English or Spanish, long or short. We wrote a program to pull this key information from the data file and insert it into the filename at fixed locations. Another program was written to continuously convert the .wav files to mp3 format, which made the transfer faster as it saved bandwidth. The routing was handled by yet another program that ensured files were sent to the appropriate vendor based on the filename. By doing this we eliminated the need to send files in batches: the files were continuously (every minute) converted from .wav to mp3 and moved to the appropriate location based on the filename. All these processes ran concurrently, not in sequence.
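The filename-driven routing described above can be sketched as follows. This is a minimal illustration only: the naming convention (`<respondentID>_<HOT|STD>_<EN|ES>.mp3`) and the destination folders are hypothetical, not the actual convention used on the project.

```python
import shutil
from pathlib import Path

# Hypothetical convention: <respondentID>_<HOT|STD>_<EN|ES>.mp3
# Destination folders below are illustrative placeholders.
ROUTES = {
    ("HOT", "EN"): Path("outbound/hot_english"),
    ("HOT", "ES"): Path("outbound/hot_spanish"),
    ("STD", "EN"): Path("outbound/std_english"),
    ("STD", "ES"): Path("outbound/std_spanish"),
}

def route_file(audio: Path) -> Path:
    """Move an audio file to its vendor folder based on the name parts."""
    _, flag, lang = audio.stem.split("_")
    dest_dir = ROUTES[(flag, lang)]
    dest_dir.mkdir(parents=True, exist_ok=True)
    return Path(shutil.move(str(audio), dest_dir / audio.name))
```

A watcher process would call `route_file` on each newly converted mp3, so routing happens continuously rather than in batches.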
  • Once the files were ready to be uploaded, we needed to be sure that the transfer of files was not human dependent. Software was purchased that would continuously upload the files to partners. This software is commonly used in the newspaper industry, and we found it suited our requirements perfectly! It also generated a status report that was sent to all project managers. To maintain confidentiality, secure FTP was used, which encrypted the files, and it was hosted on secure servers. Since the transfer of files was the backbone of the entire process, we had backup FTP servers and connectivity. The internet connectivity was set up to switch over automatically if one link went down.
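The failover idea behind the backup FTP servers can be sketched like this. The transport is injected as a callable so the pattern is clear; the `ftp_send` helper shown uses plain stdlib FTP for brevity, whereas the actual project used purchased secure-FTP software.

```python
import ftplib
from pathlib import Path
from typing import Callable, Sequence

def ftp_send(host: str, path: Path, user: str = "anonymous", password: str = "") -> None:
    """One possible transport: plain FTP via the stdlib (the real setup used secure FTP)."""
    with ftplib.FTP(host, user, password, timeout=30) as ftp:
        with path.open("rb") as fh:
            ftp.storbinary(f"STOR {path.name}", fh)

def upload_with_failover(path: Path, hosts: Sequence[str],
                         send: Callable[[str, Path], None]) -> str:
    """Try each host in order; return the one that accepted the file."""
    last_err = None
    for host in hosts:
        try:
            send(host, path)
            return host
        except OSError as err:
            last_err = err  # connection failed, try the backup server
    raise RuntimeError(f"all upload servers failed: {last_err}")
```

The same loop covers both the backup server and the backup internet link: any connection-level failure on the primary simply advances to the next route.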
  • The files needed to be processed continuously for transcription as well as coding to meet the 6:00am deadline. The transcription software had a built-in module to automatically download the audio files. It also had a module for file allocation, so once downloaded the files were not physically moved but could be accessed only through the software. This eliminated the effort required to send files to different resources and also eliminated the possibility of file loss. The coding software could easily interact with our transcription software as they were built on the same platform. The two could exchange information seamlessly, so files could move directly from transcription into coding.
  • The coding software would export data as per the project requirement, i.e. one file containing the transcription text and open-ended codes for each audio file. Once files were exported to a specified location, an upload module would pick them up and upload them to the FTP site. Before uploading, it would also do an integrity check to confirm the file was in the correct format and that no information was missing.
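An integrity check of this kind can be sketched in a few lines. The CSV layout and column names (`respondent_id`, `transcript`, `codes`) are assumptions for illustration; the project's actual export format was not described.

```python
import csv
import io

# Assumed column names for the per-interview export (illustrative only).
REQUIRED = ("respondent_id", "transcript", "codes")

def check_integrity(payload: str) -> bool:
    """Return True only if every row has all required fields filled in."""
    rows = list(csv.DictReader(io.StringIO(payload)))
    if not rows:
        return False  # an empty export is itself a failure
    return all(
        (row.get(col) or "").strip()
        for row in rows
        for col in REQUIRED
    )
```

Files failing the check would be held back and flagged rather than uploaded, so incomplete records never reach the client site.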
  • Software was deployed at Synovate to download all files uploaded by the partners. Another program consolidated the downloaded files with the audio files, ensuring that all three pieces of information existed: structured data, the transcription+coding file, and the audio file. This consolidated data was automatically uploaded to a website for the client.
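The three-way matching at the heart of consolidation reduces to a set intersection over respondent IDs. A minimal sketch (the ID scheme and data shapes are hypothetical):

```python
def consolidate(structured, transcripts, audios):
    """Split respondent IDs into those with all three pieces present
    (ready for upload) and those still missing something."""
    s, t, a = set(structured), set(transcripts), set(audios)
    complete = s & t & a
    incomplete = (s | t | a) - complete
    return sorted(complete), sorted(incomplete)
```

Only complete triples go to the client website; the incomplete list tells the team which records to chase before the 6am deadline.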
  • All manual activities apart from CATI interviewing, transcription, and coding were automated. This ensured that the project could run each day without any human dependency or intervention for manual tasks…so that Eleanor and I could sleep peacefully every night.
  • The software we developed had built-in modules to track productivity at each stage. For example, it kept track of when an audio file was downloaded and when it was allocated. It tracked the time taken by each transcriber for each file, along with the duration of the file. Productivity was similarly tracked for coding. Once we analyzed the data, we realized that we were spending a lot of time proof reading the transcribed data. Proof reading is a quality process in transcription that involves comparing the audio file with the transcribed text. Since the proof readers were already going through the transcript and understanding what the respondent was saying, we decided to train our proof readers in coding and integrate the transcription and coding software. This improved efficiency by 30%. We also realized that the allocation of files was key to good productivity. We modified our transcription software to classify audio files into different categories based on the duration of the audio. We could then route the longer files to more experienced transcribers and the shorter files to less experienced ones. It also gave us an estimate of how much work was pending at any point in time, as we could check which categories of files were pending transcription or coding.
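The duration-based classification can be sketched as a simple bucketing function. The thresholds below are illustrative; the presentation does not state the actual cut-offs used.

```python
from collections import defaultdict

def classify(duration_sec: float) -> str:
    """Bucket an audio file by its duration (thresholds are illustrative)."""
    if duration_sec < 120:
        return "short"
    if duration_sec < 420:
        return "medium"
    return "long"

def pending_by_category(files: dict) -> dict:
    """Group pending audio files by bucket, giving a quick view of how much
    work remains and which files to route to senior transcribers."""
    queues = defaultdict(list)
    for name, duration in files.items():
        queues[classify(duration)].append(name)
    return dict(queues)
```

The "long" queue would be assigned to experienced transcribers, and queue sizes per bucket give the pending-work estimate mentioned above.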
  • The next few slides have snapshots of the software we developed. This one is the allocation module of the transcription software. The files are auto-downloaded and placed in this window. The supervisor can allocate files to the different resources present on that day (see the extreme right). On the left-hand side you will see the various modules. Each module can be customized per requirement, so it is not hard coded.
  • This is a snapshot of our coding software. It has features like built-in validations customized for each project. It interfaces with the transcription software and also allows proof reading and coding to be done simultaneously. The validations helped improve coding quality.
  • We had half-hourly and daily productivity reports which were shared in real time with the executives working on the project. The supervisor would monitor productivity in real time and take corrective action if we were missing our hourly targets.
  • The same applied to coding as well.
  • As for the benefits: we met the client's timelines and quality requirements. Thanks to the automation of manual tasks and reports, the project was well managed. Most operations ran on their own, so we were not relying on people and hence had no inconsistency in operations. Transparency helped us improve our processes.
  • When we analyzed the data, we saw savings on resources as well as software. The estimated saving was $630,000 over three years: Employee costs at Synovate – $300,000; Employee costs at Datamatics – $200,000; Software licensing costs – $130,000; Total savings – $630,000.
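As a quick sanity check, the quoted line items do sum to the $630,000 three-year total:

```python
# Reconciling the savings figures quoted on the slide (three-year totals).
savings = {
    "Synovate employee costs": 300_000,
    "Datamatics employee costs": 200_000,
    "Software licensing costs": 130_000,
}
total = sum(savings.values())
assert total == 630_000  # matches the $630,000 three-year estimate
```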
  • We managed to ramp-up within a month without any quality issues.
  • Our quality ratings improved over a period of time to 95% accuracy
  • Not simply following the norm of standard processes helped us achieve substantial cost savings. Having real-time metrics in place is very important for process re-engineering initiatives. For large/long-term projects, the transaction costs of off-the-shelf software and resource costs contribute substantially to the overall cost of operations. These two areas need to be looked at closely. It is sometimes better to opt for a one-time technology cost that will offset the above costs in the long term. In our experience, using our own technology gave us more control over the project.
  • After our experience with this project, we thought of developing a conceptual model that would serve as a solid approach to ensuring operational efficiency. It could be used by anybody to customize their operations for similar large or long-term projects. The model suggests a stepladder approach, with a series of activities in a particular order. The outcome of one activity is the input for the next, and hence a planned order is important.
  • For any long-term/large project, the very first step is to understand and document the project scope and requirements. The different parameters that need to be considered are: Inputs/outputs – we need to first understand the various inputs and their formats, and the intermediate/final outputs expected. Internal/external connections – this covers the transfer of information between different groups within the company and sources outside the company, i.e. clients, client partners, and our own partners. An understanding of the technology used by external partners is essential at this stage. For example, Synovate and Datamatics got an understanding of, and aligned, their respective technologies to work with each other. Procedural complexity – the complexity of the process based on the number of inputs and outputs required, the roles of the different internal or external groups involved, and the number of changes expected. TAT – the turnaround time for the project. Quality – the quality standards expected by the client. This will have a bearing on the process design, as it may require a more stringent QC process; if you recollect our case study, we had a 100% QC check for transcription. The idea behind this activity is to gather all information on the above parameters and prepare a project document. Depending on your specific project requirements there could be other parameters added to the list, but essentially this is an information-gathering stage, and we should have all information and project requirements clearly specced out. The next step is very crucial: designing an efficient process that takes into account all parameters of the project document. The process will ensure that the project runs at optimum efficiency and turnaround time. The end result of this stage is a process flow diagram, which maps the entire process from receipt of inputs to final output delivery, and a data flow diagram, which shows the flow of information files between different processes/activities. This is followed by a gap analysis to compare the designed process with the existing process flow and identify areas that will need solutions different from the existing ones available. Once we have fixed the processes, we need to see what technology is best suited for the process. This could be tweaking existing technology or building new technology; this is called technology-process alignment. We will be talking more about this later in the presentation. It is also important that the technology deployed is aligned to the nature of the work and scalable for long-term organization needs. Scalability is very important so that the automation tools can be used for other projects as well, with some tweaking. We will look into this aspect in detail in the following slides. The last step is to do a simulation exercise before launch of the actual project.
  • Now let's look in detail at how we can get the technology and process aligned. Once we have established an efficient process for a project, the next step is to align the technology that suits the process. The idea is to have different technology options and finally decide on the technology solution that brings savings in the ‘Total Cost of Operations’ for the entire duration of the project. We can use the following parameters for deciding the best technology solution for the process: 1. Fitment of technology with process and data flow – the technology should be able to support the processes and workflow we have defined. There could be technology solutions already available within the organization which can be tweaked to suit our purpose; else we need to look for solutions outside, and we should be open to solutions from outside the MR domain. 2. Productivity – we need to determine metrics for estimating productivity and use them for making estimates on the different technology options available. For example, when we decided to use Fingerpost, we estimated the productivity for collection and distribution of files to increase by XX %, and by interfacing the transcription and coding software we estimated an improvement in TAT of 15%. 3. Training – if we are deploying new technology, a training plan is needed to equip staff to handle the technology. 4. Monitoring performance – determine how productivity and quality will be monitored. If possible, the technology deployed should provide productivity reports. The metrics are very important to check effectiveness, as well as to leave scope for further process re-engineering. 5. Resource planning – based on productivity estimates, determine the total number of resources required. 6. Costs – last but most important: what would be the total cost of operations? This includes costs for buying/tweaking technology and resource costs.
  • It is also very important that the technology deployed is aligned to long-term organization needs. It should not be used for just one project and then archived! The following parameters can help check alignment: Adaptability – we should determine how adaptable the proposed technology is for future projects. We can also check whether any current projects could benefit from the proposed technology. Flexibility – it should have room for tweaking and not be entirely hard coded. This will ensure that if there is any change in scope, it can be suitably modified to meet the changed requirements. Scalability – it should handle increases in the scale of operations. There should not be restrictions on the number of users or the number of data inputs/outputs. Scope for further process re-engineering – this is very important, else the technology will become a straitjacket for the process and any process re-engineering effort will be impossible. This happens when technology is hard coded rather than parameter based or module based. Recall the case study, where we could combine proof reading and coding only because the software could be integrated.
  • Finally, we would like to summarize by saying: optimum use of technology is only possible if you have an efficient process. To ensure an efficient process, it is important to first develop a complete understanding of the project requirements. Having a process and technology in place is only half the battle won; the metrics will win you the other half. Process re-engineering is very important to further refine the effectiveness of the process and improve your profits. Finally, it is important to partner with your service provider. This project was a success due to the partnership between Synovate and its partners/Datamatics, with each bringing different expertise to the table. Reducing the total cost of operations should be a joint effort.

Customised Operations for Customised Research at CASRO – Presentation Transcript

  • Plan for the Hour
    • Introduction
    • Objective of the paper
    • Customized Operations Model
    • Our Experience
    • Conclusion
    • Introduction
    • Projects are becoming more challenging
      • We are moving from a world of ‘OR’ to a world of ‘AND’
      • Of course, costs of operations today have to be less than yesterday!!
    • Technology in MR operations is still pretty standard (compared to the pace outside of MR)…
    Market Research Today
    • High volume
    • Unique requirements that need different approach
    • Multiple deliverables
    • Monthly/weekly/daily
    • Extended period of time
    Projects Today
    • How do you meet requirements and remain cost efficient and cost competitive?
    • How can you differentiate from your competition on turnaround time and quality?
    • How can you maximize project margins and ensure that the project will have long term savings?
    • Try to fit these large/unique projects into current processes and technology
    Typical Approach
    • One-size-fits-all approach causes overruns, leading to extra effort & resources
  • Plan for the Hour
    • Introduction
    • Objective of the paper
    • Customized Operations Model
    • Our Experience
    • Conclusion
    • Objective of the paper
    • Propose a Conceptual Model for Customized Operations that will help in:
      • Determining suitability of a project to be adapted for customized technology operations
      • Maximizing Process-Technology alignment for that project
      • Decision-making for resource planning, KPI/metrics, & alignment with long term organization goals
      • Determining total cost of operations for different approaches and suggest the best approach with minimum cost
    Objective of the Paper
  • Plan for the Hour
    • Introduction
    • Objective of the paper
    • Customized Operations Model
    • Our Experience
    • Conclusion
    • Our Experience
  • Customer Satisfaction Project
    • Customer follow-up program for a car company
    • Three year program with estimated 1.5 million interviews per year; 120,000 per month; 5,000 per day
    • 100% of interviews to be recorded - respondent comments not typed in by interviewer (Qual-Quant methodology)
    • Open end comments to be transcribed then coded
    • Tight turn around time requirements
      • Email sent to client within 30 minutes containing transcribed and coded information for any interviews flagged as Hot Alert
      • Remaining transcribing and coding completed by 6am following morning
    • Complete transparency – data, recording, resulting transcription and applied codes visible to client at respondent level via website
    Scope of the study
    • Turn-Around-Time – 30 mins for Hot Alert (4% of files) and all remaining files processed by 6:00 am
    • Logistics – large number of files to be transferred across multiple locations and then consolidated
    • Quality – stringent quality norms to be followed
    • Resource Planning - small time window to execute the entire process
    • Outsourcing – transcription not a service provided internally and timing constraints
  • [Diagram: Existing Technology – Interview start (4:00 pm CST) → Audio file per interview → File transfer to partners → Transcription & coding → Transcription + coding data → Consolidation → Upload to website (6:00 am CST)]
    • CATI Dialer automatically generates audio wave file per interview
    • Files classified per type of interview and language
    CST  6:00 am CST  4:00 pm Audio file per interview File Transfer to Partners Interview start Transcription & Coding Consolidation Transcription+ coding data Upload to website Existing Technology
  • [Diagram: Existing Technology, repeated]
    • Every 30 minutes, files batched and sent to partners for transcribing and coding
    • The file transfer is by FTP
  • [Diagram: Existing Technology, repeated]
    • Audio Files are downloaded by partners
    • Team transcribes and codes using software available
    • Transcription and coding data combined
    • At periodic intervals, the transcription/coding data file is batched and uploaded to Synovate servers
  • [Diagram: Existing Technology, repeated]
    • Once files downloaded by Synovate, they are matched back with other structure data and audio file
    • Files are then uploaded batch-wise to website for client
  • Wait a minute – Was that the BEST way??
  • Can we make it better, i.e. more profitable operations?
  • Of course we can and we did!!
    • We followed a conceptual model for customized research which focused on
      • Establishing an efficient process after understanding the project requirements
      • Getting the technology to fit the process
    How, you ask?
  • The “Out of the Box” Process
    • Continuous feed of files (no human intervention)
      • Automated file transfer and renaming process
      • Program written to continuously convert files to MP3 to save bandwidth
      • Program written to identify and route files appropriately
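The rename-and-route programs might look like the sketch below. The filename convention (`<study>_<language>_<interviewid>.wav`) and folder layout are assumptions for illustration, not taken from the paper; the MP3 conversion assumes `ffmpeg` is on the PATH.

```python
# Sketch of the convert-and-route programs; naming convention is assumed.
import re
import subprocess

# Assumed convention: <study>_<language>_<interviewid>.wav
NAME_RE = re.compile(r"(?P<study>[a-z0-9]+)_(?P<lang>[a-z]{2})_(?P<iid>\d+)\.wav$")

def route_target(filename, root="/outbound"):
    """Decide where a file should go based on its language code."""
    m = NAME_RE.match(filename)
    if m is None:
        return None  # unrecognised files are left for manual review
    return f"{root}/{m.group('lang')}/{m.group('study')}_{m.group('iid')}.mp3"

def convert_to_mp3(wav_path, mp3_path):
    """Compress a WAV file to MP3 to save transfer bandwidth (needs ffmpeg)."""
    subprocess.run(["ffmpeg", "-y", "-i", wav_path, mp3_path], check=True)
```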
    CST  6:00 am CST  4:00 pm File per interview Interview start Transcription & Coding Download individual files and Consolidate Continuous upload - Transcribed + coded data Upload to website Continuous transfer to partners Process Technology Fit File conversion & routing
    • Transfer of files
      • Software purchased for continuous upload/download of files to partners
    • Security of files
      • Secure FTP purchased and hosted on secure servers
    • Backup plans in case the file transfer fails
      • Backup FTP servers
      • Backup internet connectivity
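The backup plan amounts to a failover loop: if the primary route fails, retry the same upload over the backup server or link. A minimal sketch, where `upload` is any callable that raises `OSError` on a connection or transfer failure:

```python
# Sketch of the transfer failover; route names are hypothetical.
def upload_with_fallback(upload, routes):
    """Try each transfer route in order (primary FTP, backup FTP,
    backup link) and return the first one that succeeded."""
    last_error = None
    for route in routes:
        try:
            upload(route)
            return route
        except OSError as err:  # connection / transfer failure
            last_error = err
    raise RuntimeError(f"all transfer routes failed: {last_error}")
```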
    CST  6:00 am CST  4:00 pm File per interview Interview start Transcription & Coding Download individual files and Consolidate Continuous upload - Transcribed + coded data Upload to website Continuous transfer to partners Process Technology Fit File conversion & routing
    • Continuous processing of a large number of files
      • Developed transcription software to auto-download and distribute files to transcribers
      • Developed web-enabled coding software
      • Both applications track hourly productivity
    • Transcription and coding needed to be seamlessly interconnected
      • A common platform for both applications so they can talk to each other
    CST  6:00 am CST  4:00 pm File per interview Interview start Transcription & Coding Download individual files and Consolidate Continuous upload - Transcribed + coded data Upload to website Continuous transfer to partners Process Technology Fit File conversion & routing
    • One file per interview
      • The coding software combines the transcription text with the open-ended codes
    • File integrity & upload
      • A module checks file integrity before upload
      • Files are automatically uploaded as soon as responses are coded
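The paper does not spell out what the integrity module verified; a plausible sketch is a checksum match plus a completeness check that every interview record carries both transcript text and codes before it is released for upload:

```python
# Sketch of an integrity-check module; the specific checks are assumptions.
import hashlib

def checksum(data: bytes) -> str:
    """MD5 fingerprint used to confirm a file survived transfer intact."""
    return hashlib.md5(data).hexdigest()

def is_complete(record: dict) -> bool:
    """A coded interview must carry both transcript text and codes."""
    return bool(record.get("transcript")) and bool(record.get("codes"))

def ready_for_upload(record: dict, expected_md5: str) -> bool:
    """Release a record only if it is complete and its audio is intact."""
    return is_complete(record) and checksum(record["audio"]) == expected_md5
```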
    CST  6:00 am CST  4:00 pm File per interview Interview start Transcription & Coding Download individual files and Consolidate Continuous upload - Transcribed + coded data Upload to website Continuous transfer to partners Process Technology Fit File conversion & routing
    • Continuous downloading of structured data
      • Software used to auto-download the data
    • Structured data needs to be consolidated with the audio file
      • Program written to check file integrity and consolidate
    • Upload consolidated data back to the client website
      • Automated program uploads the audio file and data to the website
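The consolidation step matches each structured-data record back to its audio file. A minimal sketch, assuming the interview id doubles as the audio filename stem (an illustration, not stated in the paper):

```python
# Sketch of the consolidation step; id-as-filename-stem is an assumption.
from pathlib import Path

def consolidate(rows, audio_files):
    """Pair each structured-data row with its audio file; report misses."""
    audio_by_id = {Path(f).stem: f for f in audio_files}
    merged, unmatched = [], []
    for row in rows:
        audio = audio_by_id.get(row["interview_id"])
        if audio is None:
            unmatched.append(row["interview_id"])
        else:
            merged.append({**row, "audio_file": audio})
    return merged, unmatched
```

Anything in `unmatched` would be held back rather than uploaded, which is the integrity guarantee the process relies on.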
    CST  6:00 am CST  4:00 pm File per interview Interview start Transcription & Coding Download individual files and Consolidate Continuous upload - Transcribed + coded data Upload to website Continuous transfer to partners Process Technology Fit File conversion & routing
  • Process-Technology Fit, in short – automated all the manual tasks
  • But the story does not end here….
    • The software provided real-time metrics to measure efficiency at each stage
    • Further process re-engineering was possible
      • Integration of the transcription and coding software
      • Classification of audio files into buckets, which helped with allocations
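The bucketing idea can be sketched as grouping audio files by length so work can be shared out evenly among transcribers. The duration thresholds below are invented for illustration:

```python
# Sketch of duration-based bucketing for allocations; thresholds assumed.
def bucket(duration_sec):
    """Assign an audio file to a workload bucket by its duration."""
    if duration_sec < 300:
        return "short"
    if duration_sec < 900:
        return "medium"
    return "long"

def allocate(durations):
    """Count files per bucket to plan transcriber allocations."""
    counts = {"short": 0, "medium": 0, "long": 0}
    for d in durations:
        counts[bucket(d)] += 1
    return counts
```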
    A few weeks later…
  • Software Snapshot
  • Productivity Tracker
    • Quality & TAT – delivered as per client requirements and well managed
    • Process- and technology-driven – human intervention rarely required
    • Transparency – real-time metrics available for every activity
    • Resources – 20% fewer resources required
    • Software – 40% savings on licensing/transaction costs
    • Total cost of operations – estimated savings of $630,000 over three years
  • Managed increased volumes
  • Improved Quality
  • So what did we learn from this?
    • It pays to think out of the box and design your own process
    • Metrics are a key requirement for process re-engineering
    • For large/long-term projects, look closely at the existing software (transaction) and resource costs
    • Using your own technology gives you more control over your process and metrics
  • Plan for the Hour
    • Introduction
    • Objective of the paper
    • Customized Operations Model
    • Our Experience
    • Conclusion
  • [Conceptual Model diagram, for long-term/large projects: understand the project scope & requirements → design the process per project requirements → technology & process alignment → technology & organization alignment → automation/simulation, with project documents and processes as inputs]
    • Inputs/Outputs of the process
    • Internal/external connections
    • Roles of different resources
    • Procedural Complexity
    • TAT (turnaround time)
    • Quality requirements
    • Process flow diagram
    • Data flow diagram
    • Gap Analysis
  • Technology Process Alignment
    • Parameters used for checking alignment
        • Fit of the technology with the process-flow and data-flow diagrams
        • Productivity
        • Training
        • Monitoring performance
        • Resource planning
        • Costs
  • Technology Organization Alignment
    • Parameters used for checking alignment
        • Adaptability
        • Flexibility
        • Scalability
        • Scope for further process re-engineering
  • Conclusion
    • Technology is an enabler, not a solution by itself
    • Large, ongoing projects need a different approach
    • Understanding the project scope and designing an efficient process is the first and most important step, followed by an appropriate technology solution
    • Metrics are key to process re-engineering; the technology solution selected should provide real-time data
    • Process re-engineering and automation are an ongoing effort
    • Move from “lift & shift” outsourcing to smart-sourcing and partnership with service providers
  • Questions
  • Thank You...