Actively looking for new opportunities in Business Intelligence, Data Integration, and Data Warehousing. Hands-on experience in data analysis, providing ETL solutions, and building reports, dashboards, and frameworks for business.
Tools/Technologies:
-Databases: SQL Server 2012, Teradata, MySQL.
-Reporting Tools: Pentaho Report Designer, Tableau.
-Dashboard Tool: Pentaho CDF, Pentaho CDE, Saiku.
-ETL Tools: Pentaho PDI, SSIS
-Scripting Languages: Python, UNIX Shell Scripting, JavaScript.
-Cloud: AWS (S3, RDS, EC2, DMS, Glue), Snowflake, Matillion
Microsoft Power Stack 2019 [Power BI, Excel, Azure & Friends], by Olivier Travers
Making sense of Microsoft's renewed push in the business intelligence sector with:
- Power BI including Power BI Report Server, Premium, Embedded, APIs...
- Excel
- Flow, PowerApps, SharePoint, Teams
- SQL Server
- Azure IaaS / PaaS / SaaS
[Latest update: mid-2019] I have put an inordinate amount of research time into keeping this presentation up to date since its original publication in February 2017. The latest version is available upon request for a reasonable fee, and for my consulting clients; don't hesitate to contact me to discuss your project.
Olivier Travers
olivier@needlestacker.com
Highlights from our portfolio of AI projects, including marketing, quality, portfolio, financial, and insurance analytics solutions, demonstrating the versatility and power of domain expertise, AI, and Tableau's platform.
Power BI has become a product with a ton of exciting features. This presentation gives an overview of some of them, including Power BI Desktop, the Power BI service, what's new, integration with other services, Power BI Premium, and administration.
A competent professional with over 7 years of experience in Business Intelligence & Data Warehousing, having worked in roles such as Business Analyst, Data Analyst & Reporting Developer, ETL Architect, and Data Architect. Extensively worked with a wide range of tools such as Tableau, Spotfire, and SAP BusinessObjects. Skilled in Teradata, Oracle Database, MS SQL Server, MS Access, PostgreSQL, requirements analysis, reporting, Extract, Transform, Load (ETL), data warehousing, end-to-end execution of BI projects, system integration, value delivery, team leadership, and the onshore-offshore working model.
Business Expertise: Power & Engineering (GE Power), Banking and Financial Services (GE Capital), and the leasing domain.
Interests: Data Science.
I am Murali. Below is a short summary of my professional experience:
Total years of experience: 13 (relevant BO experience: 10 years)
Technologies/tools worked on: SAP BO 4.x - XI R2, BO Admin, Xcelsius dashboard design, Crystal Reports, SAP Design Studio, SAP BW 7, Unix, Oracle 10g PL/SQL, SQL Server, and Microsoft technologies (VB, ASP, and VB.NET).
I am interested in taking up new roles covering different aspects of the BO domain.
The Building Blocks of QuestDB, a Time Series Database, by Javier Ramirez
Talk delivered at the Valencia Codes Meetup, June 2024.
Traditionally, databases have treated timestamps just as another data type. However, when performing real-time analytics, timestamps should be first class citizens and we need rich time semantics to get the most out of our data. We also need to deal with ever growing datasets while keeping performant, which is as fun as it sounds.
It is no wonder time-series databases are now more popular than ever. Join me in this session to learn about the internal architecture and building blocks of QuestDB, an open-source time-series database designed for speed. We will also review some of the changes we have gone through over the past two years to deal with late and unordered data, non-blocking writes, read replicas, and faster batch ingestion.
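One of the challenges the abstract mentions, late and unordered data, can be illustrated with a small sketch of the general buffer-and-merge approach. This is an illustration only, not QuestDB's actual implementation; the class and its storage layout are made up:

```python
# Buffer-and-merge handling of late, out-of-order time-series writes.
# Illustrative only; not how QuestDB is actually implemented.
import bisect

class TimeSeriesTable:
    def __init__(self):
        self.rows = []      # committed rows, kept sorted by timestamp
        self.buffer = []    # recent, possibly out-of-order writes

    def append(self, ts, value):
        self.buffer.append((ts, value))   # O(1): writes are not blocked

    def commit(self):
        # Sort the buffer, then place each row at its ordered position.
        # Real engines typically rewrite only the tail of the table that
        # overlaps the late data instead of touching every row.
        self.buffer.sort()
        for row in self.buffer:
            bisect.insort(self.rows, row)
        self.buffer.clear()
```

Keeping ingestion append-only and deferring the ordering work to commit time is what lets writes stay non-blocking even when rows arrive out of timestamp order.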
Unleashing the Power of Data: Choosing a Trusted Analytics Platform, by Enterprise Wired
In this guide, we'll explore the key considerations and features to look for when choosing a trusted analytics platform that meets your organization's needs and delivers actionable intelligence you can trust.
Enhanced Enterprise Intelligence with Your Personal AI Data Copilot, by GetInData
Recently we have observed the rise of open-source Large Language Models (LLMs) that are community-driven or developed by AI market leaders such as Meta (Llama3), Databricks (DBRX), and Snowflake (Arctic). On the other hand, interest is growing in specialized, carefully fine-tuned yet relatively small models that can efficiently assist programmers in day-to-day tasks. Finally, Retrieval-Augmented Generation (RAG) architectures have gained a lot of traction as the preferred approach to LLM context and prompt augmentation for building conversational SQL data copilots, code copilots, and chatbots.
In this presentation, we will show how we built a robust Data Copilot upon these three concepts, one that can help democratize access to company data assets and boost the performance of everyone working with data platforms.
- Why do we need yet another (open-source) copilot?
- How can we build one?
- Architecture and evaluation
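As a rough illustration of the RAG approach the abstract describes, here is a minimal sketch of retrieval-augmented prompt building for a SQL copilot. The bag-of-words cosine similarity stands in for a real embedding model, and the schema snippets and function names are hypothetical, not GetInData's implementation:

```python
# Minimal sketch of Retrieval-Augmented Generation (RAG) for a SQL copilot:
# retrieve the schema snippets most similar to the question and prepend them
# to the LLM prompt. Bag-of-words cosine similarity stands in for a real
# embedding model; the schema snippets below are hypothetical.
from collections import Counter
import math

SCHEMA_DOCS = [
    "table orders: order_id, customer_id, order_date, total_amount",
    "table customers: customer_id, name, country, signup_date",
    "table products: product_id, name, category, unit_price",
]

def bow(text):
    # Crude tokenization into a bag-of-words vector.
    return Counter(text.lower().replace(",", " ").replace(":", " ").split())

def cosine(a, b):
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(question, docs, k=2):
    q = bow(question)
    return sorted(docs, key=lambda d: cosine(q, bow(d)), reverse=True)[:k]

def build_prompt(question, docs):
    # Augment the prompt with the retrieved context before calling an LLM.
    context = "\n".join(retrieve(question, docs))
    return f"Schema context:\n{context}\n\nQuestion: {question}\nSQL:"

prompt = build_prompt("total amount of orders per customer", SCHEMA_DOCS)
```

In a production copilot the retrieval step would use a vector database and a real embedding model, but the shape of the pipeline (retrieve, augment, generate) is the same.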
Techniques to optimize the PageRank algorithm usually fall into two categories: reducing the work per iteration, and reducing the number of iterations. These goals are often at odds with one another. Skipping computation on vertices that have already converged can save iteration time. Skipping in-identical vertices (those with the same in-links) reduces duplicate computation and can thus reduce iteration time. Road networks often have chains that can be short-circuited before the PageRank computation to improve performance, since the final ranks of chain nodes are easy to calculate; this can reduce both the iteration time and the number of iterations. If a graph has no dangling nodes, the PageRank of each strongly connected component can be computed in topological order, which can reduce the iteration time and the number of iterations, and also enables multi-iteration concurrency in the PageRank computation. The combination of all of the above methods is the STICD algorithm [sticd]. For dynamic graphs, unchanged components whose ranks are unaffected can be skipped altogether.
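The first technique above, skipping computation on converged vertices, can be sketched as a small power-iteration PageRank. This is an illustrative sketch (the graph representation and tolerance are assumptions), not the STICD implementation:

```python
# Minimal power-iteration PageRank that skips already-converged vertices.
# Illustrative sketch: graph format and tolerance are assumptions.

def pagerank_skip_converged(out_links, damping=0.85, tol=1e-6, max_iter=100):
    """out_links: dict mapping each vertex to the list of vertices it links to.
    Every vertex must appear as a key, even with an empty out-link list."""
    vertices = list(out_links)
    n = len(vertices)
    rank = {v: 1.0 / n for v in vertices}
    converged = set()

    # Precompute in-links so each vertex can pull rank from its sources.
    in_links = {v: [] for v in vertices}
    for u, outs in out_links.items():
        for v in outs:
            in_links[v].append(u)

    for _ in range(max_iter):
        changed = False
        new_rank = dict(rank)
        for v in vertices:
            if v in converged:
                continue  # the work-saving step: no recomputation here
            total = sum(rank[u] / len(out_links[u]) for u in in_links[v])
            new_rank[v] = (1 - damping) / n + damping * total
            if abs(new_rank[v] - rank[v]) < tol:
                converged.add(v)  # freeze this vertex from now on
            else:
                changed = True
        rank = new_rank
        if not changed:
            break
    return rank
```

Note that freezing a vertex once its rank stops moving is a heuristic: a frozen vertex's true rank can still drift if its in-neighbors keep changing, which is one way reducing per-iteration work and reducing the iteration count can pull in opposite directions.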
06-04-2024 - NYC Tech Week - Discussion on Vector Databases, Unstructured Data and AI
https://www.meetup.com/unstructured-data-meetup-new-york/
This meetup is for people working in unstructured data. Speakers will present on related topics such as vector databases, LLMs, and managing data at scale. The intended audience includes roles like machine learning engineers, data scientists, data engineers, software engineers, and PMs. This meetup was formerly the Milvus Meetup, and is sponsored by Zilliz, maintainers of Milvus.
Learn SQL from Basic Queries to Advanced Queries, by manishkhaire30
Dive into the world of data analysis with our comprehensive guide on mastering SQL! This presentation offers a practical approach to learning SQL, focusing on real-world applications and hands-on practice. Whether you're a beginner or looking to sharpen your skills, this guide provides the tools you need to extract, analyze, and interpret data effectively.
Key Highlights:
Foundations of SQL: Understand the basics of SQL, including data retrieval, filtering, and aggregation.
Advanced Queries: Learn to craft complex queries to uncover deep insights from your data.
Data Trends and Patterns: Discover how to identify and interpret trends and patterns in your datasets.
Practical Examples: Follow step-by-step examples to apply SQL techniques in real-world scenarios.
Actionable Insights: Gain the skills to derive actionable insights that drive informed decision-making.
Join us on this journey to enhance your data analysis capabilities and unlock the full potential of SQL. Perfect for data enthusiasts, analysts, and anyone eager to harness the power of data!
#DataAnalysis #SQL #LearningSQL #DataInsights #DataScience #Analytics
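As a hands-on taste of the progression described above (retrieval and filtering, then aggregation, then a more advanced query), the following sketch uses Python's built-in sqlite3 module so it runs anywhere; the sales table and figures are made up for illustration:

```python
# From basic retrieval to an "advanced" query with a subquery, using
# Python's built-in sqlite3; the sales table and figures are made up.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.executescript("""
    CREATE TABLE sales (region TEXT, product TEXT, amount REAL);
    INSERT INTO sales VALUES
        ('North', 'Widget', 120.0),
        ('North', 'Gadget', 80.0),
        ('South', 'Widget', 200.0),
        ('South', 'Gadget', 40.0);
""")

# Foundations: retrieval with filtering.
north = conn.execute(
    "SELECT product, amount FROM sales WHERE region = 'North'"
).fetchall()

# Aggregation: total sales per region.
totals = conn.execute(
    "SELECT region, SUM(amount) FROM sales GROUP BY region ORDER BY region"
).fetchall()

# Advanced: regions whose total exceeds twice the average sale amount,
# computed with a scalar subquery in the HAVING clause.
above = conn.execute("""
    SELECT region, SUM(amount) AS total
    FROM sales
    GROUP BY region
    HAVING total > 2 * (SELECT AVG(amount) FROM sales)
""").fetchall()
```

The same SELECT/WHERE/GROUP BY/HAVING building blocks carry over to any SQL database; only the connection setup changes.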
06-04-2024 - NYC Tech Week - Discussion on Vector Databases, Unstructured Data and AI
Round table discussion of vector databases, unstructured data, ai, big data, real-time, robots and Milvus.
A lively discussion with the NJ Gen AI Meetup lead, Prasad, and Procure.FYI's Co-Founder.
Resume of Abhishek Jaiswal, Senior Data Engineer
Abhishek Jaiswal Mobile: 8971948312
E-mail:abhishek.jaiswal2225@gmail.com
Technical Skills
Platforms: Windows 98/2000/2008/XP/8, Linux.
Databases: SQL Server 2014, Teradata, MySQL, Oracle 9i, IBM DB2, Greenplum 4.1.
Reporting Tools: Pentaho Report Designer, Tableau.
Dashboard Tool: Pentaho CDF, Pentaho CDE, Saiku.
ETL Tools: Pentaho PDI, SSIS
Scripting Languages: Python, UNIX Shell Scripting, JavaScript.
Cloud: AWS (S3, RDS, EC2, DMS, Glue), Snowflake.
Professional Achievements and Recognitions
Awarded Rising Star by Cognizant Technology Solution.
Awarded Wow Member by Sears for data analysis.
Awarded Best Performer by Sears for building the master customer database for home services.
Certifications
Oracle Certified SQL/PL SQL 10g
IBM Certified Solution Developer, InfoSphere DataStage 8.0
Professional Summary
More than 8 years of Experience in delivering Business Intelligence, Data Integration and Data Warehouse Solutions.
More than 6 years of Experience in building reports, dashboards and framework for business.
More than 3 years in Designing and developing Data Architecture and BI Solutions.
More than 3 Years of Experience using Agile Development methodology.
End-to-end data warehousing experience across full SDLC from inception to deployment.
Experience in Implementing Data Governance, Master Data Management (MDM) and Metadata Management.
Experience in delivering products using Cloud Based Technology Primarily AWS and Snowflake
Experience working on various use cases in Data Integration and Business Intelligence using various commercial and open-source tools.
Experience in US retail and e-commerce.
Excellent problem-solving, time-management, and organizational skills.
Managed projects to ensure that deliverables are met within schedule, budget, & quality.
Demonstrated ability to bridge the gap between IT, Business Users & Senior Management.
Ability to develop & maintain effective relationships with stakeholders & business partners.
Ability to effectively lead, build, deliver, and execute complex global BI initiatives.
Excellent Communication and Interpersonal Skills, Self-Motivated and Strong Team Player.
Project Details
Customer Data Master for Operational and Analytics Purposes (Sears), July 2017 – Present
Data analysis on multiple tables in Teradata for customer information such as address, phone, and email.
Data migration from Teradata to SQL Server using SSIS.
Created functions to clean names, addresses, phone numbers, and emails.
Created stored procedures to associate data for the same address, phone, and email.
Created stored procedures to refresh data on a daily basis with multiple validations.
Worked with the AWS data migration tool to migrate data from one database to another.
Used tools like JMeter to perform performance testing on the database.
Wrote optimized queries to improve performance.
Created multiple jobs to refresh data on a daily and monthly basis.
Created a log table that gives status information about each day's data load.
Tools Used: Teradata, SQL Server, Python, SSIS, AWS data migration tool, JMeter.
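The cleaning-and-association steps this project describes can be sketched as follows. This is an illustrative Python version, not the actual SQL functions and stored procedures; the field formats and function names are assumptions:

```python
# Illustrative Python version of the cleaning and matching steps (the actual
# project used SQL functions and stored procedures; field formats assumed).
import re
from collections import defaultdict

def clean_phone(raw):
    # Keep digits only; treat the last 10 digits as the canonical number.
    digits = re.sub(r"\D", "", raw or "")
    return digits[-10:] if len(digits) >= 10 else None

def clean_email(raw):
    email = (raw or "").strip().lower()
    return email if re.fullmatch(r"[^@\s]+@[^@\s]+\.[^@\s]+", email) else None

def master_records(rows):
    """Group rows that share a cleaned phone or email into one master entry."""
    groups = defaultdict(list)
    for i, row in enumerate(rows):
        key = clean_phone(row.get("phone")) or clean_email(row.get("email"))
        # Rows with no usable contact field stay as singletons.
        groups[key if key is not None else f"singleton-{i}"].append(row)
    return dict(groups)
```

Normalizing each field before matching is what lets records written in different formats, such as "(555) 123-4567" and "555.123.4567", collapse into one master customer entry.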
Data analysis of sales data and building reports in Tableau (Affine Analytics), Oct 2016 – June 2017
Data migration from Teradata to SQL Server.
Created multiple reports for sales data for home services.
Service intelligence tool R&D.
Data analysis on multiple tables for daily, weekly, and monthly sales, and built reports in Tableau.
Tools Used: MySQL, Tableau, SQL Server, Business Intelligence, service intelligence tool.
Multiple Developments as Part of the Jio Matrix Team (Reliance Jio), Aug 2015 – Oct 2016
Designed the front-end login page for Jio Matrix users.
Created security at the landing page for different roles.
Created multiple dashboards for the Jio Matrix team.
Resolved most of the bugs related to Pentaho BI.
Successfully migrated from Pentaho CDE 5.3 to 6.0.
Handled the end-to-end process: gathering requirements from the client, understanding the requirements, creating and documenting a process, talking to business analysts, creating reports and dashboards, and handling issues and supporting users.
Tools Used: Pentaho, Reporting, Dashboards, Framework, CDE, SQL, JavaScript, HTML, CSS, MySQL, Business Intelligence, Hadoop Hive.
Multiple Developments as Part of the Analytics Team, Souq.com, May 2014 – Jul 2015
Created a generic framework using CDE where users put in a query and the report is generated automatically.
Migrated Pentaho from version 3.8 to 5.0, and then to 5.3.
Created a search option in Pentaho 5.0, a help page, and navigation based on user access.
Built a customer relationship management tool for end users called Query Builder; with this tool, end users can create a query using multiple selections and filters with different containers.
Built an alert framework tool where users can schedule their reports, which run at the scheduled time and send a mail to the specified users.
Performed many optimizations to improve Pentaho performance.
Actively participated as an individual team member in this project, involved in the design and development of each framework.
Tools Used: Pentaho CDE 5.0, MySQL, SQLyog.
Insight Compass ELM Data Mart Migration, Cognizant Technology Solutions, Dec 2011 – Apr 2014
Managed a team of six by assigning and monitoring their daily tasks.
Built close and productive relationships with the stakeholders.
Gathered requirements from the client based on requests.
Designed and reviewed the ETL jobs created.
Reviewed reports developed on BO.
Wrote queries to test ETL job/report data.
Performance-tuned jobs developed in Pentaho PDI.
Tools Used: Business Objects, IBM DB2, Greenplum, Pentaho PDI.
One Cognizant Analytical Reporting Platform, Cognizant Application Services, Aug 2011 – Dec 2011
Closely interacted with the client to understand and document requirements.
Designed and developed ETL jobs using DataStage.
Performance-tuned all the jobs created.
Negotiated with the client on CRs, deadlines, etc.
Performed root-cause analysis for data mismatches and discrepancies.
Created validation routines for automated job validation.
Prepared and maintained documents for auditing purposes.
Created Unix shell scripts for file manipulation, log auditing, etc.
Tuned ETL performance to increase efficiency, bringing the execution time down from 13 hours to 2 hours.
Scheduled DataStage jobs.
Tools Used: DataStage, Business Objects, DB2, Unix
Education
2010: Visvesvaraya Technological University, Bachelor of Engineering (Computer Science)