[vc_row row_type=”row” use_row_as_full_screen_section=”no” type=”grid” text_align=”center” background_image=”349911″ box_shadow_on_row=”no”][vc_column][vc_column_text css=”.vc_custom_1462275948630{padding-top: 150px !important;padding-bottom: 150px !important;}”]

Case Studies

[/vc_column_text][/vc_column][/vc_row][vc_row row_type=”row” use_row_as_full_screen_section=”no” type=”full_width” text_align=”left” box_shadow_on_row=”no”][vc_column][vc_empty_space image_repeat=”no-repeat”][/vc_column][/vc_row][vc_row row_type=”row” use_row_as_full_screen_section=”no” type=”grid” text_align=”left” padding_bottom=”30″ box_shadow_on_row=”no”][vc_column][vc_column_text css=”.vc_custom_1661289111108{padding-bottom: 10px !important;}”]

Case Study: Sumo Logic Machine Data Analytics Platform Modernization Using Snowflake

[/vc_column_text][vc_column_text css=”.vc_custom_1661292394552{padding-bottom: 30px !important;}”]

Industry: IT & SOFTWARE

[/vc_column_text][vc_single_image image=”362040″ img_size=”full” alignment=”center” css=”.vc_custom_1661284569062{padding-bottom: 20px !important;}”][vc_column_text css=”.vc_custom_1464077494093{padding-bottom: 10px !important;}”]

Business Objectives

[/vc_column_text][vc_column_text css=”.vc_custom_1661286531066{padding-bottom: 30px !important;}”]Sumo Logic’s vision was to build a scalable data cloud platform to collect, consolidate, and cleanse data across ERP, CRM, Billing, Marketing, and Sumo platform sources, providing a 360-degree view of the business to executives and operational managers, and trusted data to data scientists for predictive and prescriptive analytics. The key focus area for the initial phase was to build KPIs and dashboards for the Finance, Sales, and Marketing functions.

 

Sumo wanted to eliminate the entirely manual, Excel-based reporting process used to generate the KPIs and dashboards for executive briefings and SEC reporting, and to manage day-to-day operations.

 

This initiative was to automate and replace the laborious manual Excel-based reporting used to generate a monthly executive summary covering Revenue, ARR, Pipeline, Headcount, OpEx, CapEx, Marketing spend, and more.[/vc_column_text][vc_column_text css=”.vc_custom_1464077510987{padding-bottom: 10px !important;}”]

Solution

[/vc_column_text][vc_column_text css=”.vc_custom_1661464444480{padding-bottom: 30px !important;}”]Sumo Logic asked Infometry to redesign and modernize its complex business model using Snowflake Data Cloud solutions:

 

• Provided an Enterprise Data Strategy and Data Cloud Architecture involving Snowflake, Matillion, and Tableau Online.

• Worked closely with infrastructure, security, and IT leadership to procure product licenses and to handle installation and configuration.

• Integrated Salesforce, NetSuite, Coupa, Zuora, Marketo, Google Analytics, Sumo-on-Sumo, and file sources such as Excel and CSV with the Snowflake Data Cloud in near real time.

• Designed and developed the dimensional model and the physical and semantic layers, with business metrics, KPIs, and snapshots for as-of reporting and analysis.

• Built an ARR (Annual Recurring Revenue) framework and modeled the complex logic embedded in Excel to derive metrics such as New Logo, Upgrade, Downgrade, Churn, and ARR across multiple dimensions.[/vc_column_text][vc_column_text css=”.vc_custom_1661284772117{padding-bottom: 10px !important;}”]
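The ARR framework described above classifies each customer's period-over-period movement in recurring revenue. A minimal sketch of that classification logic is shown below; the function name and rules are illustrative assumptions, not Sumo Logic's actual Snowflake model, which spanned many more dimensions and snapshots.

```python
def classify_arr_change(prev_arr: float, curr_arr: float) -> str:
    """Classify a customer's period-over-period ARR movement.

    Illustrative sketch only -- the real framework modeled far more
    dimensions (entities, products, snapshots) inside Snowflake.
    """
    if prev_arr == 0 and curr_arr > 0:
        return "New Logo"    # first recurring revenue from this customer
    if prev_arr > 0 and curr_arr == 0:
        return "Churn"       # customer lost entirely
    if curr_arr > prev_arr:
        return "Upgrade"     # expansion within an existing customer
    if curr_arr < prev_arr:
        return "Downgrade"   # contraction within an existing customer
    return "Flat"            # no ARR movement this period
```

Summing the deltas per category across customers then yields the ARR bridge (new, expansion, contraction, churn) that the Excel model previously produced by hand.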

Challenges

[/vc_column_text][vc_column_text css=”.vc_custom_1661464491258{padding-bottom: 30px !important;}”]Sumo Logic needed a complete technology stack, including a Data Cloud platform, data integration tools (ETL/ELT), and an analytical platform for visualization and ad-hoc analysis, which introduced several challenges:

 

• Infometry had to perform data discovery and assessment activities, interviewing business leaders, IT, and the Data Security team to understand the requirements and current landscape, and then propose an enterprise data strategy and architecture, including tools, technologies, and best practices, to deliver analytics solutions for Sumo Logic.

• It was challenging for the IT and Data team to support frequently changing business processes and enhancements.

• Company acquisitions added complexity when merging data into the Sumo system. The Finance and Sales teams needed snapshot-based historical comparisons of hundreds of metrics.

• The Excel-based reporting lacked documentation and required a reverse-engineering effort.

• The business process was complex.

• Operational support required workarounds due to customization. Sumo Logic’s purpose-driven data analytics platform needed modernization to reduce overall maintenance, deliver features faster, and improve the customer experience with Tableau and the Snowflake Data Cloud.[/vc_column_text][no_button size=”medium” style=”transparent” button_hover_animation=”default” icon_pack=”” target=”_self” text=”Download the Case Study PDF” link=”https://www.infometry.net/wp-content/uploads/2022/08/2022-07-12-Sumo-Logic-Case-study-Machine-Data-Analytics-Platform-Modernization-Using-Snowflake.pdf” color=”#fd7d23″][/vc_column][/vc_row][vc_row row_type=”row” use_row_as_full_screen_section=”no” type=”grid” text_align=”left” padding_bottom=”30″ box_shadow_on_row=”no”][vc_column][vc_column_text css=”.vc_custom_1649951297836{padding-bottom: 10px !important;}”]

Case Study: Marketo data replication

[/vc_column_text][vc_column_text css=”.vc_custom_1462324311479{padding-bottom: 30px !important;}”]

Industry: Manufacturing

[/vc_column_text][vc_single_image image=”359790″ img_size=”full” alignment=”center” css=”.vc_custom_1649952642269{padding-bottom: 20px !important;}”][vc_column_text css=”.vc_custom_1464077494093{padding-bottom: 10px !important;}”]

Business Objectives

[/vc_column_text][vc_column_text css=”.vc_custom_1649952005792{padding-bottom: 30px !important;}”]The customer was challenged with data replication through Marketo’s complex API. Almost all the objects in the database had to be extracted and loaded into the data warehouse using complex API methods, and the customer wanted a dynamic method to pick up any new data created at the source. The API enforces low call limits, which made bulk data extraction difficult, and its multi-step extraction process quickly consumed the call quota.[/vc_column_text][vc_column_text css=”.vc_custom_1464077510987{padding-bottom: 10px !important;}”]

Solution

[/vc_column_text][vc_column_text css=”.vc_custom_1649952045971{padding-bottom: 30px !important;}”]All the required APIs were tested and classified based on the way their data is extracted. For the objects under each classification, separate code/mappings were created to produce the output. The code/mapping is dynamic and holds its parameters in a staging area in Snowflake; the parameters are iterated through, one API call value at a time. A Python script creates the target table in Snowflake based on the incoming data, then extracts the data, compiles it, and stores it in an S3 bucket in JSON form. A stored procedure copies the files from S3 into Snowflake by executing the COPY INTO command; it also flattens the JSON data and loads it into the respective table, and any scheduler can execute the Python scripts sequentially.[/vc_column_text][vc_column_text css=”.vc_custom_1464077528829{padding-bottom: 10px !important;}”]
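The extraction pattern described above, iterating parameter values through a rate-limited API and compiling the responses into JSON for staging, can be sketched as follows. The `fetch` callable, the call limit, and the return shape are illustrative assumptions; the actual solution called the Marketo API, landed the JSON in S3, and loaded it with a Snowflake COPY INTO stored procedure.

```python
import json

def extract_object(fetch, param_values, daily_call_limit=500):
    """Iterate parameter values one by one through a rate-limited API,
    compiling the responses into a single JSON document for staging.

    `fetch` stands in for one Marketo API call (an assumption for this
    sketch); stopping at the call limit lets a scheduler resume the
    leftover parameters in the next window.
    """
    records, calls = [], 0
    remaining = list(param_values)
    while remaining and calls < daily_call_limit:
        value = remaining.pop(0)
        records.extend(fetch(value))   # one API call per parameter value
        calls += 1
    # Compiled JSON is ready to upload to an S3 stage for COPY INTO;
    # leftover parameters carry over to the next scheduled run.
    return json.dumps(records), remaining
```

In the actual solution, the compiled JSON files in S3 were then copied into Snowflake and flattened into relational tables by the stored procedure.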

Success Criteria & Business Value

[/vc_column_text][vc_column_text css=”.vc_custom_1649953105863{padding-bottom: 30px !important;}”][no_unordered_list style=”circle” number_type=”circle_number” animate=”no” font_weight=”normal”]

Our Infofiscus Marketo Data Replication solution helped the customer reduce complexity by consolidating the APIs into just a few categories, seven in total. These categories are scheduled to run at staggered intervals to avoid hitting the API limits. The solution is dynamic: any new data added at the source is extracted automatically. It is automated end to end and makes the data flow simpler.

[/no_unordered_list][/vc_column_text][/vc_column][/vc_row][vc_row row_type=”row” use_row_as_full_screen_section=”no” type=”grid” text_align=”left” padding_bottom=”30″ box_shadow_on_row=”no”][vc_column][vc_column_text css=”.vc_custom_1462326571539{padding-bottom: 10px !important;}”]

Case Study: Business Intelligence/Data Discovery

[/vc_column_text][vc_column_text css=”.vc_custom_1462324311479{padding-bottom: 30px !important;}”]

Industry: Manufacturing

[/vc_column_text][vc_single_image image=”349916″ img_size=”full” alignment=”center” css=”.vc_custom_1463573115680{padding-bottom: 20px !important;}”][vc_column_text css=”.vc_custom_1464077494093{padding-bottom: 10px !important;}”]

Business Objectives

[/vc_column_text][vc_column_text css=”.vc_custom_1462326352714{padding-bottom: 30px !important;}”]The customer was challenged by a highly inefficient, manual data-collection process across multiple systems to build daily and weekly finance and operational reports, resulting in data inconsistency, inaccuracy, incompleteness, and operational inefficiency. To begin with, the customer wanted to integrate their ERP, CRM, Supply Chain, Finance, manufacturing, and planning applications in real time.[/vc_column_text][vc_column_text css=”.vc_custom_1464077510987{padding-bottom: 10px !important;}”]

Solution

[/vc_column_text][vc_column_text css=”.vc_custom_1462326390220{padding-bottom: 30px !important;}”]Infometry was hired to conduct BI/data discovery and a data architecture review, and to put together an Enterprise Data Strategy and BI Roadmap. The Infometry team began with a six-week engagement, conducting executive management interviews to understand business goals, meeting enterprise application architects to understand business and data flows, and performing data discovery using Infometry tools and processes. Infometry provided the customer with an Enterprise Data Integration and Orchestration strategy and an ETL/DM vendor analysis, and presented short-term and long-term Data Warehouse, MDM, and Analytics roadmaps for the customer's data and analytics needs.[/vc_column_text][vc_column_text css=”.vc_custom_1464077528829{padding-bottom: 10px !important;}”]

Success Criteria & Business Value

[/vc_column_text][vc_column_text css=”.vc_custom_1462326546071{padding-bottom: 30px !important;}”][no_unordered_list style=”circle” number_type=”circle_number” animate=”no” font_weight=”normal”]

  • Gained clarity on the data flow and data topology
  • Gauged the magnitude of data consistency and data quality issues
  • Identified critical data integration points and delivery mechanisms
  • Created a business case to develop an Enterprise Data Warehouse and Data Orchestration platform
  • Defined a scalable architecture for Cloud Data Integration, ETL, MDM and the Enterprise Data Warehouse
  • Created a business case for 3rd-party data enrichment such as D&B, address cleansing and geocodes

[/no_unordered_list][/vc_column_text][/vc_column][/vc_row][vc_row row_type=”row” use_row_as_full_screen_section=”no” type=”grid” text_align=”left” background_color=”#ededed” padding_bottom=”30″ box_shadow_on_row=”no”][vc_column][vc_column_text css=”.vc_custom_1462327801715{padding-top: 30px !important;padding-bottom: 10px !important;}”]

Case Study: Informatica Cloud Data Orchestration Implementation

[/vc_column_text][vc_column_text css=”.vc_custom_1462324311479{padding-bottom: 30px !important;}”]

Industry: Manufacturing

[/vc_column_text][vc_single_image image=”349917″ img_size=”full” alignment=”center” css=”.vc_custom_1463574407306{padding-bottom: 20px !important;}”][vc_column_text css=”.vc_custom_1464077546511{padding-bottom: 10px !important;}”]

Business Objectives

[/vc_column_text][vc_column_text css=”.vc_custom_1462326352714{padding-bottom: 30px !important;}”]The customer was challenged by a highly inefficient, manual data-collection process across multiple systems to build daily and weekly finance and operational reports, resulting in data inconsistency, inaccuracy, incompleteness, and operational inefficiency. To begin with, the customer wanted to integrate their ERP, CRM, Supply Chain, Finance, manufacturing, and planning applications in real time.[/vc_column_text][vc_column_text css=”.vc_custom_1464077559066{padding-bottom: 10px !important;}”]

Solution

[/vc_column_text][vc_column_text css=”.vc_custom_1462327369763{padding-bottom: 30px !important;}”]Infometry built the solution for the entire tools group, providing Informatica as a PaaS that combines the EDR application with other data integration services, enabling users to self-serve the development, execution, and governance of integration workflows among on-premise or cloud-based applications as well as traditional and newer data protocols.[/vc_column_text][vc_column_text css=”.vc_custom_1464077600825{padding-bottom: 10px !important;}”]

Success Criteria & Business Value

[/vc_column_text][vc_column_text css=”.vc_custom_1462327397421{padding-bottom: 30px !important;}”]This helped the customer unify cloud data with the rest of the enterprise data and ensure maximum value from SaaS investments. The benefit came from having bulk and real-time integration on one platform.[/vc_column_text][/vc_column][/vc_row][vc_row row_type=”row” use_row_as_full_screen_section=”no” type=”grid” text_align=”left” padding_bottom=”30″ box_shadow_on_row=”no”][vc_column][vc_column_text css=”.vc_custom_1462329690364{padding-top: 30px !important;padding-bottom: 10px !important;}”]

Case Study: Project Web Access (PWA)-SAP Integration

[/vc_column_text][vc_column_text css=”.vc_custom_1462324311479{padding-bottom: 30px !important;}”]

Industry: Manufacturing

[/vc_column_text][vc_single_image image=”349922″ img_size=”full” alignment=”center” css=”.vc_custom_1463574473628{padding-bottom: 20px !important;}”][vc_column_text css=”.vc_custom_1464077620577{padding-bottom: 10px !important;}”]

Business Objectives

[/vc_column_text][vc_column_text css=”.vc_custom_1462327865211{padding-bottom: 30px !important;}”]The customer is a large manufacturing company, and supply chain is one of the most important functions of their operation. The customer was using multiple tools: Microsoft Project Server for maintaining schedules, SAP for generating manufacturing orders and sequences and for labor and material forecasting, and other legacy systems for Bills of Materials. The customer was challenged with data inconsistency between these systems due to a manual data-entry process and human error, resulting in major delays in product delivery.[/vc_column_text][vc_column_text css=”.vc_custom_1464077638721{padding-bottom: 10px !important;}”]

Solution

[/vc_column_text][vc_column_text css=”.vc_custom_1462327879302{padding-bottom: 30px !important;}”]The customer wanted to integrate Microsoft Project Server, SAP, and the legacy systems bi-directionally in real time to ensure data consistency and transparency in information exchange, to avoid delays in product delivery, and to ensure on-time delivery.[/vc_column_text][vc_column_text css=”.vc_custom_1464077657797{padding-bottom: 10px !important;}”]

Success Criteria & Business Value

[/vc_column_text][vc_column_text css=”.vc_custom_1462327895302{padding-bottom: 30px !important;}”]This automated process helped add any missing required sequences not included in the Project Server project plan. It also replaced the existing manual process of detecting and communicating rule violations concerning dates, missing information, or schedule changes on already-scheduled orders in SAP. In response to the impacts on the existing Sequence Implementation process, this project was undertaken to automate the manual engineering methods then in place.[/vc_column_text][/vc_column][/vc_row][vc_row row_type=”row” use_row_as_full_screen_section=”no” type=”grid” text_align=”left” background_color=”#ededed” padding_bottom=”30″ box_shadow_on_row=”no”][vc_column][vc_column_text css=”.vc_custom_1462328264929{padding-top: 30px !important;padding-bottom: 10px !important;}”]

Case Study: Business Intelligence Self Service – Hybrid Data Warehouse

[/vc_column_text][vc_column_text css=”.vc_custom_1462324311479{padding-bottom: 30px !important;}”]

Industry: Manufacturing

[/vc_column_text][vc_single_image image=”349923″ img_size=”full” alignment=”center” css=”.vc_custom_1463575374649{padding-bottom: 20px !important;}”][vc_column_text css=”.vc_custom_1464077678632{padding-bottom: 10px !important;}”]

Business Objectives

[/vc_column_text][vc_column_text css=”.vc_custom_1462328288576{padding-bottom: 30px !important;}”]The supply chain analytics team requires key information from the engineering systems to complete their KPIs. Supply chain will also be looking to bring in key data for the S&OP tool set under the SIOP initiative. The data management team will collaborate with them to ensure that they have access to the data required as part of the global Data Architecture initiatives.[/vc_column_text][vc_column_text css=”.vc_custom_1464077691395{padding-bottom: 10px !important;}”]

Solution

[/vc_column_text][vc_column_text css=”.vc_custom_1462328303148{padding-bottom: 30px !important;}”]Infometry built the solution, creating a new operational data store (ODS) to provide the Supply Chain Analytics team with the required information from multiple systems such as engineering, solution development, operational planning, ordering, and quoting for this global initiative. Informatica Cloud services helped synchronize data for real-time reporting and represent the point in time based on a daily record.[/vc_column_text][vc_column_text css=”.vc_custom_1464077704910{padding-bottom: 10px !important;}”]

Success Criteria & Business Value

[/vc_column_text][vc_column_text css=”.vc_custom_1462328320486{padding-bottom: 30px !important;}”]This key dataset improves supply chain efficiency and ensures accurate capacity plans for the plants while allowing KPIs to be collected on the purchasing process. Automating this process will save 1,000 hours annually in additional reporting work, and that value will increase as S&OP is rolled out globally.[/vc_column_text][/vc_column][/vc_row][vc_row row_type=”row” use_row_as_full_screen_section=”no” type=”grid” text_align=”left” padding_bottom=”30″ box_shadow_on_row=”no”][vc_column][vc_column_text css=”.vc_custom_1462329754811{padding-top: 30px !important;padding-bottom: 10px !important;}”]

Case Study: Project Web Access(PWA) – Project Schedule Milestone Reporting

[/vc_column_text][vc_column_text css=”.vc_custom_1462324311479{padding-bottom: 30px !important;}”]

Industry: Manufacturing

[/vc_column_text][vc_single_image image=”349926″ img_size=”full” alignment=”center” css=”.vc_custom_1463575404002{padding-bottom: 20px !important;}”][vc_column_text css=”.vc_custom_1464077722678{padding-bottom: 10px !important;}”]

Business Objectives

[/vc_column_text][vc_column_text css=”.vc_custom_1462329770943{padding-bottom: 30px !important;}”]This automation and efficiency initiative was planned to fully automate an existing, semi-automated process designed to provide the PMO and leadership with the current status of active project plan milestones.[/vc_column_text][vc_column_text css=”.vc_custom_1464077736269{padding-bottom: 10px !important;}”]

Solution

[/vc_column_text][vc_column_text css=”.vc_custom_1462329786199{padding-bottom: 30px !important;}”]In addition to fully automating the data collection, delivery, and retention of this reporting process, the solution enables enhanced business reporting, previously limited by the Excel spreadsheet output, by utilizing the data presentation capabilities of the Tableau BI reporting tool.[/vc_column_text][vc_column_text css=”.vc_custom_1464077753315{padding-bottom: 10px !important;}”]

Success Criteria & Business Value

[/vc_column_text][vc_column_text css=”.vc_custom_1462329803603{padding-bottom: 10px !important;}”]The report and the data integration layer were heavily used at the CEO, PMO, and executive committee level to answer questions prior to stakeholder meetings.[/vc_column_text][vc_column_text css=”.vc_custom_1465296071492{padding-bottom: 30px !important;}”][no_unordered_list style=”circle” number_type=”circle_number” animate=”no” font_weight=”normal”]

  • What are the issues with meeting customer and financial commitments?
  • What have we done to mitigate the issues, and what are the solutions and action plans?
  • What are we doing to provide upside in other areas of the project?
  • What have we learned as project leaders?

[/no_unordered_list][/vc_column_text][/vc_column][/vc_row][vc_row row_type=”row” use_row_as_full_screen_section=”no” type=”grid” text_align=”left” background_color=”#ededed” padding_bottom=”30″ box_shadow_on_row=”no”][vc_column][vc_column_text css=”.vc_custom_1462330198281{padding-top: 30px !important;padding-bottom: 10px !important;}”]

Case Study: PCIP – Post Contract Implementation Performance

[/vc_column_text][vc_column_text css=”.vc_custom_1462324311479{padding-bottom: 30px !important;}”]

Industry: Manufacturing

[/vc_column_text][vc_single_image image=”349929″ img_size=”full” alignment=”center” css=”.vc_custom_1463575436091{padding-bottom: 20px !important;}”][vc_column_text css=”.vc_custom_1464077783088{padding-bottom: 10px !important;}”]

Business Objectives

[/vc_column_text][vc_column_text css=”.vc_custom_1462330227315{padding-bottom: 30px !important;}”]The objective of Post Contract Implementation Performance is to understand the manufacturing process after the contract is signed, as the project moves from design to delivery.[/vc_column_text][vc_column_text css=”.vc_custom_1464077797432{padding-bottom: 10px !important;}”]

Solution

[/vc_column_text][vc_column_text css=”.vc_custom_1462330250296{padding-bottom: 10px !important;}”]The solution required building a powerful visualization for executives to view on-time delivery metrics for active projects at key milestones such as:[/vc_column_text][vc_column_text css=”.vc_custom_1462330419363{padding-bottom: 30px !important;}”][no_unordered_list style=”circle” number_type=”circle_number” animate=”no” font_weight=”normal”]

  • Design Freeze
  • Order Entry
  • Received on Site from Manufacturing
  • Received on site – Resale
  • Installation Complete
  • Beneficial Use

[/no_unordered_list][/vc_column_text][vc_column_text css=”.vc_custom_1464077810028{padding-bottom: 10px !important;}”]

Success Criteria & Business Value

[/vc_column_text][vc_column_text css=”.vc_custom_1462330452240{padding-bottom: 10px !important;}”]The report and the data integration layer were heavily used at the CEO, PMO, and executive committee level to answer questions such as:[/vc_column_text][vc_column_text css=”.vc_custom_1462330633291{padding-bottom: 30px !important;}”][no_unordered_list style=”circle” number_type=”circle_number” animate=”no” font_weight=”normal”]

  • How are the gates working from design through order entry to manufacturing?
  • What percentage of the project is behind schedule each week?
  • Why is the schedule late for order entry?
  • How have we delivered the project?
  • How on-time were we in delivery?

[/no_unordered_list][/vc_column_text][/vc_column][/vc_row][vc_row row_type=”row” use_row_as_full_screen_section=”no” type=”grid” text_align=”left” padding_bottom=”30″ box_shadow_on_row=”no”][vc_column][vc_column_text css=”.vc_custom_1462333393950{padding-top: 30px !important;padding-bottom: 10px !important;}”]

Case Study: Hadoop Implementation for a large online travel industry Customer

[/vc_column_text][vc_column_text css=”.vc_custom_1462333410182{padding-bottom: 30px !important;}”]

Industry: Ecommerce/Travel

[/vc_column_text][vc_single_image image=”349932″ img_size=”full” alignment=”center” css=”.vc_custom_1463575471517{padding-bottom: 20px !important;}”][vc_column_text css=”.vc_custom_1464077832031{padding-bottom: 10px !important;}”]

Business Objectives

[/vc_column_text][vc_column_text css=”.vc_custom_1462333424330{padding-bottom: 30px !important;}”]The customer wanted to reduce the total cost of ownership of their data warehouse environment by migrating to an open-source Hadoop Big Data architecture.[/vc_column_text][vc_column_text css=”.vc_custom_1464077847521{padding-bottom: 10px !important;}”]

Solution

[/vc_column_text][vc_column_text css=”.vc_custom_1462333437494{padding-bottom: 30px !important;}”]Architected the Hadoop infrastructure, capacity, security, and data strategy, and configured Hive, Sqoop, and other administration tools. Each existing ETL stored procedure touching the raw LZ Omniture tables was converted to a series of MapReduce jobs. Raw Omniture files were stored in Hadoop HDFS in a directory structure organized by TPID, Year, Month, and Day, enabling processing and re-processing for different periods of time. The output of the Omniture ETL workflows was exported from Hadoop back into the existing DB2 ADS and data marts. An export utility was built for R1.0 to bring selective datasets from Hadoop into a DB2 scratchpad for ad-hoc analysis and advanced analytics using traditional SQL methods.[/vc_column_text][vc_column_text css=”.vc_custom_1464077861734{padding-bottom: 10px !important;}”]
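The directory convention described above, raw Omniture files partitioned by TPID, Year, Month, and Day, is what makes selective re-processing of time windows possible. A small sketch of that layout follows; the root path and key names are assumptions for illustration, not the customer's actual HDFS scheme.

```python
from datetime import date, timedelta

def partition_path(tpid: int, d: date, root: str = "/data/omniture/raw") -> str:
    """Build the HDFS directory for one TPID/day of raw Omniture files.

    Root path and key=value naming are illustrative assumptions.
    """
    return f"{root}/tpid={tpid}/year={d.year}/month={d.month:02d}/day={d.day:02d}"

def paths_for_period(tpid: int, start: date, end: date) -> list:
    """Enumerate the partition directories for a re-processing window,
    one directory per day, inclusive of both endpoints."""
    days = (end - start).days + 1
    return [partition_path(tpid, start + timedelta(n)) for n in range(days)]
```

A MapReduce job (or a re-run for a corrected period) can then take exactly the directories returned by `paths_for_period` as its input, leaving all other partitions untouched.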

Success Criteria & Business Value

[/vc_column_text][vc_column_text css=”.vc_custom_1462333463053{padding-bottom: 30px !important;}”]The Infometry team worked closely with the online travel company to solve a Big Data analytics problem, designing and architecting a Hadoop/MapReduce solution to replace their existing DB2-based data warehouse platform. This resulted in savings of 8.5 million over 2.5 years and also enabled the customer to analyze large data volumes leveraging Hive/Hue.[/vc_column_text][/vc_column][/vc_row][vc_row row_type=”row” use_row_as_full_screen_section=”no” type=”grid” text_align=”left” background_color=”#ededed” padding_bottom=”30″ box_shadow_on_row=”no”][vc_column][vc_column_text css=”.vc_custom_1462333870547{padding-top: 30px !important;padding-bottom: 10px !important;}”]

Case Study: Enterprise Data Warehouse, Finance Analytics and Operational Reports

[/vc_column_text][vc_column_text css=”.vc_custom_1462333893557{padding-bottom: 30px !important;}”]

Industry: Bio Technology & Pharmaceutical

[/vc_column_text][vc_single_image image=”349937″ img_size=”full” alignment=”center” css=”.vc_custom_1463575493940{padding-bottom: 20px !important;}”][vc_column_text css=”.vc_custom_1464077878280{padding-bottom: 10px !important;}”]

Business Objectives

[/vc_column_text][vc_column_text css=”.vc_custom_1462333909500{padding-bottom: 30px !important;}”]A leading global biotechnology company needed a high-performance enterprise analytical solution to manage their finance and operations and to track their R&D activities. In the current operation, the majority of reports were Excel-based and the data was manually collected across multiple systems and departments, which was highly labor-intensive; the existing ERP reports were of limited help in meeting their analytical needs. The finance team wanted a single view of OpEx, CapEx, and P&L across the organization, including their entities in other parts of the world.[/vc_column_text][vc_column_text css=”.vc_custom_1464077912527{padding-bottom: 10px !important;}”]

Solution

[/vc_column_text][vc_column_text css=”.vc_custom_1465296296212{padding-bottom: 30px !important;}”][no_unordered_list style=”circle” number_type=”circle_number” animate=”no” font_weight=”normal”]

  • Infometry built a data warehouse solution with an enterprise global audience in mind to support Finance, HR, and Accounting as part of Phase 1. Complex summary and detail reports were built with full drill capability to analyze OpEx and CapEx across all entities, departments, and projects.
  • Business Objects XI 3.1 Edge series was used for enterprise reporting and Xcelsius 2008 for the executive dashboards. Single sign-on was implemented using Windows AD, and BOBJ Universe user security was implemented at the entity, department, and project level.
  • Microsoft SQL Server 2008 was used as the data warehouse DB platform and SSIS (SQL Server Integration Services) as the ETL tool. Infometry developed an ETL framework for rapid ETL development, including a standard package framework to handle errors, self-documentation, and highly configurable parameters. OLAP cubes were built using SSAS (Analysis Services) to perform ad-hoc analysis.

[/no_unordered_list][/vc_column_text][vc_column_text css=”.vc_custom_1464077934984{padding-bottom: 10px !important;}”]

Success Criteria & Business Value

[/vc_column_text][vc_column_text css=”.vc_custom_1462334037269{padding-bottom: 10px !important;}”]The customer is now able to see summary-level OpEx, CapEx, and accrual reports that drill down to the sub-ledger transaction level, and can compare actual numbers, including headcount, with Forecast and Plan numbers. The customer can perform analysis across items, vendors, time, chart of accounts, projects, and departments across the US and other parts of the world.[/vc_column_text][vc_column_text css=”.vc_custom_1462334053545{padding-bottom: 30px !important;}”]Dashboards provided a consolidated view of overall spending, and business users can now perform ad-hoc analysis using Business Objects WebI and Voyager. The customer could completely eliminate the manual process of collecting data across multiple departments and eliminate the Excel spreadsheets. This alone increased their productivity by 80%.[/vc_column_text][/vc_column][/vc_row][vc_row row_type=”row” use_row_as_full_screen_section=”no” type=”grid” text_align=”left” padding_bottom=”30″ box_shadow_on_row=”no”][vc_column][vc_column_text css=”.vc_custom_1462337529223{padding-top: 30px !important;padding-bottom: 10px !important;}”]

Case Study: Mobile Analytics for Service Provider

[/vc_column_text][vc_column_text css=”.vc_custom_1462337545696{padding-bottom: 30px !important;}”]

Industry: Telecom

[/vc_column_text][vc_single_image image=”349938″ img_size=”full” alignment=”center” css=”.vc_custom_1463575514572{padding-bottom: 20px !important;}”][vc_column_text css=”.vc_custom_1464077949369{padding-bottom: 10px !important;}”]

Business Objectives

[/vc_column_text][vc_column_text css=”.vc_custom_1462337562989{padding-bottom: 30px !important;}”]Due to the phenomenal growth in the usage of smart devices, all carriers are experiencing a data tsunami, with data growing by over 20TB per week. Location intelligence also demands real-time ad targeting. An inexpensive solution therefore had to be architected to avoid performing all business analytics on Oracle and expensive storage like EMC. Real-time analytics was also required for location-based recommendations.[/vc_column_text][vc_column_text css=”.vc_custom_1464077964913{padding-bottom: 10px !important;}”]

Solution

[/vc_column_text][vc_column_text css=”.vc_custom_1465296441324{padding-bottom: 30px !important;}”][no_unordered_list style=”circle” number_type=”circle_number” animate=”no” font_weight=”normal”]

  • Enterprise Data Warehouse, clickstream data analysis, and real-time analytics: architected the data warehouse staging on Hadoop/MapReduce using 100 inexpensive Linux server nodes to process 2B transactions per day and 100TB of data.
  • Used Hive to provide batch reports, thereby reducing load on the Oracle DW production instance.
  • Transformation and cleaning services were designed to be invoked before the data was stored in Hadoop/MapReduce.
  • The solution helped integrate data from various sources such as Venturi video streaming, Techtronix probes, call data records, and clickstream data.

[/no_unordered_list][/vc_column_text][vc_column_text css=”.vc_custom_1464077982324{padding-bottom: 10px !important;}”]

Success Criteria & Business Value

[/vc_column_text][vc_column_text css=”.vc_custom_1462337597710{padding-bottom: 30px !important;}”]TCO was reduced by 40% by providing a hybrid Hadoop/MapReduce solution. A Java recommendation engine was developed to provide real-time recommendations to mobile users, opening a new revenue stream.[/vc_column_text][/vc_column][/vc_row][vc_row row_type=”row” use_row_as_full_screen_section=”no” type=”grid” text_align=”left” background_color=”#ededed” padding_bottom=”30″ box_shadow_on_row=”no”][vc_column][vc_column_text css=”.vc_custom_1462338056449{padding-top: 30px !important;padding-bottom: 10px !important;}”]

Case Study: eCommerce Analytics in the Cloud

[/vc_column_text][vc_column_text css=”.vc_custom_1462338075113{padding-bottom: 30px !important;}”]

Industry: eCommerce

[/vc_column_text][vc_single_image image=”349943″ img_size=”full” alignment=”center” css=”.vc_custom_1463575536954{padding-bottom: 20px !important;}”][vc_column_text css=”.vc_custom_1464077996155{padding-bottom: 10px !important;}”]

Business Objectives

[/vc_column_text][vc_column_text css=”.vc_custom_1462338087128{padding-bottom: 30px !important;}”]The eCommerce platform was opened up to social networking among merchants in the cloud, which drove a phenomenal increase in structured and unstructured data, to over 10 TB per week. An inexpensive solution therefore had to be architected to avoid running the entire business analytics workload on Oracle and expensive storage such as EMC. The unstructured data also had to be mined for business intelligence using SAS.[/vc_column_text][vc_column_text css=”.vc_custom_1464078011293{padding-bottom: 10px !important;}”]

Solution

[/vc_column_text][vc_column_text css=”.vc_custom_1465296557390{padding-bottom: 30px !important;}”][no_unordered_list style=”circle” number_type=”circle_number” animate=”no” font_weight=”normal”]

  • Architected the data warehouse staging layer on Hadoop/MapReduce using 60 inexpensive Linux server nodes to process over 10 TB of unstructured data per week.
  • Used Hive to produce batch reports, reducing the load on the Oracle DW production instance.
  • Designed transformation and cleansing services to run before data was stored in Hadoop/MapReduce. The unstructured data came from merchant discovery and forums; it occupied over 90% of the Hadoop/MapReduce cluster, yet the intelligence derived from it consumed less than 20% of Oracle DW storage and less than 5% of its computing resources.
  • The hybrid solution combined the Oracle DW with the Java/XML framework of BIRT Actuate to provide ad-hoc reporting and drill-down in analytical cubes.

[/no_unordered_list][/vc_column_text][vc_column_text css=”.vc_custom_1464078022732{padding-bottom: 10px !important;}”]

Success Criteria & Business Value

[/vc_column_text][vc_column_text css=”.vc_custom_1462338131649{padding-bottom: 30px !important;}”]TCO was reduced by 30% through the hybrid Hadoop/MapReduce solution. The unstructured data was mined, and only the distilled intelligence was captured in the Oracle DW. The solution was successfully implemented for half a million merchants in the eCommerce cloud.[/vc_column_text][/vc_column][/vc_row][vc_row row_type=”row” use_row_as_full_screen_section=”no” type=”grid” text_align=”left” padding_bottom=”30″ box_shadow_on_row=”no”][vc_column][vc_column_text css=”.vc_custom_1642683791174{padding-top: 30px !important;padding-bottom: 10px !important;}”]

Case Study: Sharing Information Between APIs

[/vc_column_text][vc_column_text css=”.vc_custom_1642683810404{padding-bottom: 30px !important;}”]

Industry: eCommerce / File Transfer

[/vc_column_text][vc_single_image image=”357263″ img_size=”full” alignment=”center” css=”.vc_custom_1642685286601{padding-bottom: 20px !important;}”][vc_column_text css=”.vc_custom_1464077949369{padding-bottom: 10px !important;}”]

Business Objectives

[/vc_column_text][vc_column_text css=”.vc_custom_1642684124969{padding-bottom: 30px !important;}”]Information plays a major role in eCommerce. An eCommerce platform has a great deal of information to share among customers, vendors, goods, and logistics systems. For example, order confirmation, order shipment, and tracking details must be sent to the customer. If the customer wants to return an order, the return order details, return confirmation, and finally the payment details must be sent. In a B2B scenario, stock details must also be sent to the vendor.

 

The main objective of the file transfer application is to move confidential information from one system to another in a safe and secure manner.[/vc_column_text][vc_column_text css=”.vc_custom_1464077964913{padding-bottom: 10px !important;}”]

Solution

[/vc_column_text][vc_column_text css=”.vc_custom_1642684262734{padding-bottom: 30px !important;}”]Infometry built shared inbound and shared outbound file transfer applications in MuleSoft to move information safely and securely. The shared inbound file transfer application brings data from a third-party system into the Infometry system.

 

The shared outbound file transfer application moves data from the Infometry system to a third-party system. To achieve this, we used several connectors (FTP, FTPS, File, and SFTP) and operations (PGP encryption, PGP decryption, file compression, etc.).[/vc_column_text][vc_column_text css=”.vc_custom_1464077982324{padding-bottom: 10px !important;}”]
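The outbound flow described above (compress the payload, then transfer it to the third party) can be sketched roughly with the Python standard library. This is an illustrative sketch only: the real solution uses MuleSoft connectors, and the host, credentials, and remote filename below are placeholders.

```python
import ftplib
import gzip
import io

def compress_payload(data: bytes) -> bytes:
    """Gzip-compress the outgoing payload (stands in for the file-compression step)."""
    return gzip.compress(data)

def send_outbound(host: str, user: str, password: str,
                  remote_name: str, payload: bytes) -> None:
    """Upload a compressed payload to a third-party FTP server.

    In the MuleSoft application this role is played by the FTP/FTPS/SFTP
    connectors; host and credentials here are placeholders.
    """
    compressed = compress_payload(payload)
    with ftplib.FTP(host) as ftp:
        ftp.login(user, password)
        # STOR writes the in-memory compressed bytes as the remote file
        ftp.storbinary("STOR " + remote_name, io.BytesIO(compressed))
```

A PGP encryption step would sit between compression and upload in the same pipeline; it is omitted here because it requires libraries outside the standard library.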

Success Criteria & Business Value

[/vc_column_text][vc_column_text css=”.vc_custom_1642684172711{padding-bottom: 30px !important;}”]The shared file transfer applications cover all inbound and outbound operations and can be reused by other applications to transfer files, reducing the effort and time needed to move information.

 

Reuse in the integration world minimizes labor cost, increases efficiency, and makes post-production support and bug fixes much easier.[/vc_column_text][/vc_column][/vc_row][vc_row row_type=”row” use_row_as_full_screen_section=”no” type=”grid” text_align=”left” background_color=”#ededed” padding_bottom=”30″ box_shadow_on_row=”no”][vc_column][vc_column_text css=”.vc_custom_1647347722097{padding-top: 30px !important;padding-bottom: 10px !important;}”]

Case Study: Cloud-Based Data Analytics

[/vc_column_text][vc_column_text css=”.vc_custom_1647347660153{padding-bottom: 30px !important;}”]

Industry: Marketing Analytics

[/vc_column_text][vc_single_image image=”359093″ img_size=”full” alignment=”center” css=”.vc_custom_1647348220460{padding-bottom: 20px !important;}”][vc_column_text css=”.vc_custom_1647346800944{padding-bottom: 10px !important;}”]

ORGANIZATION

[/vc_column_text][vc_column_text css=”.vc_custom_1647347555672{padding-bottom: 30px !important;}”]The client is a cloud-based data analytics company focusing on operations, security, and BI use cases. It provides log management and analytics services that leverage machine-generated big data to deliver real-time IT insights.[/vc_column_text][vc_column_text css=”.vc_custom_1647346840785{padding-bottom: 10px !important;}”]

CHALLENGE

[/vc_column_text][vc_column_text css=”.vc_custom_1647347591264{padding-bottom: 30px !important;}”]Our client captures marketing data from multiple sources, and it was difficult to consolidate the same set of data from all sources into one place. The immediate challenges included:

Extracting marketing data from one of the sources via its API. The API offers no pagination for collecting all the data in one run; records accumulate internally through several loops before all of them are exposed. This was complex to achieve in ETL tools and reduced overall performance. For one of the subjects, multiple iterations were also required.

All of this made the process time-consuming and infeasible. End-to-end automation, the use of variables, and multiple levels of looping made the standard solution complex, so error handling was also complicated.[/vc_column_text][vc_column_text css=”.vc_custom_1647346898409{padding-bottom: 10px !important;}”]

SOLUTION

[/vc_column_text][vc_column_text css=”.vc_custom_1647347522457{padding-bottom: 30px !important;}”]The standard solution was to use a POST request in Matillion to obtain an access token, which was then sent with GET requests to the endpoint to extract data. As mentioned above, there was no pagination, so the endpoint had to be looped through using variables until all records were extracted, which made the standard approach infeasible. Instead, we used the Python Script component in Matillion: the Python code contained all the loops and variables, keeping the rest of the mapping simple. For failure recovery, we overlapped the data extraction time periods and used change data capture (CDC) in the ETL to filter out the duplicate records introduced by the overlap. This solution produced the best results, and both runtime and complexity were reduced dramatically.[/vc_column_text][/vc_column][/vc_row]
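The extraction pattern described in the solution (fetch a token, loop until the API stops returning records, then filter out records duplicated by the overlapping time window) can be sketched as below. This is an assumption-laden illustration, not the actual Matillion component code: the auth payload shape, the `id` field used for deduplication, and the injected `fetch_page` function are all hypothetical.

```python
import json
import urllib.request

def fetch_access_token(auth_url, client_id, client_secret):
    """POST for an access token; URL and payload shape are illustrative assumptions."""
    body = json.dumps({"client_id": client_id, "client_secret": client_secret}).encode()
    req = urllib.request.Request(auth_url, data=body,
                                 headers={"Content-Type": "application/json"})
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["access_token"]

def extract_all(data_url, token, fetch_page):
    """Loop until the API returns no more records, since it offers no pagination.

    fetch_page(url, token, offset) -> list of records; injected so the HTTP
    call can be swapped out or tested independently.
    """
    records, offset = [], 0
    while True:
        page = fetch_page(data_url, token, offset)
        if not page:
            break
        records.extend(page)
        offset += len(page)
    return records

def dedupe_overlap(records, seen_ids):
    """CDC-style filter: drop records already loaded by a previous overlapping run.

    The "id" key is a hypothetical unique record identifier.
    """
    fresh = [r for r in records if r["id"] not in seen_ids]
    seen_ids.update(r["id"] for r in fresh)
    return fresh
```

Separating the looping (`extract_all`) from the dedup filter (`dedupe_overlap`) mirrors the design in the case study: the Python script absorbs the looping complexity, while CDC removes the duplicates introduced by the overlapping extraction windows.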