This project focuses on creating data frames, filtering data, grouping data, merging, and displaying data. It also includes creating new columns to which specific conditions can be applied. The data is used to solve business problems within a superstore.
The first problem statement is determining the prices of the top 5 products in the Mobiles & Tablets category. Second, the data is processed to check whether sales in the Others category decreased in 2022; the task also requires displaying the 20 products with the largest decrease. Third, I use the data to retrieve the customer ID and registered date of consumers who have checked out but have not yet made payment. Fourth, the data is sorted and analyzed to compare average daily sales on weekends with those on weekdays over a three-month period.
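Two of the tasks above can be sketched with pandas as follows. The sample data and its column names (product, category, price, date, amount) are illustrative assumptions, not the real superstore schema:

```python
import pandas as pd

# Hypothetical sales data; column names are assumptions, not the real dataset schema.
sales = pd.DataFrame({
    "product":  ["Phone A", "Phone B", "Tablet C", "Phone D", "Tablet E", "Chair F"],
    "category": ["Mobiles & Tablets"] * 5 + ["Others"],
    "price":    [699, 499, 329, 899, 259, 120],
})

# Problem 1: prices of the top 5 products in the Mobiles & Tablets category.
top5 = (sales[sales["category"] == "Mobiles & Tablets"]
        .sort_values("price", ascending=False)
        .head(5)[["product", "price"]])
print(top5)

# Problem 4: average daily sales, weekends vs. weekdays, over some date range.
orders = pd.DataFrame({
    "date":   pd.to_datetime(["2022-01-03", "2022-01-08", "2022-01-09",
                              "2022-02-14", "2022-03-19"]),
    "amount": [100, 250, 300, 150, 200],
})
daily = orders.groupby("date")["amount"].sum()        # total sales per day
is_weekend = daily.index.dayofweek >= 5               # Saturday=5, Sunday=6
avg = daily.groupby(is_weekend).mean()                # False = weekdays, True = weekends
print(avg)
```

The same filter-sort-group pattern extends to the other two problems (filtering unpaid checkouts, grouping the Others category by year).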
Assignment Learning Objectives:
BSIS 105
Assignment 3: Purchasing Example Using SAP ERP
Primary Learning Objectives:
· Experience the steps in a typical purchasing transaction
· See how an ERP system handles a typical purchasing transaction
The objective of this assignment is for you to become familiar with the steps and the documents involved in a typical purchasing transaction and also investigate how the SAP system operates for this type of transaction. We will be using the financial accounting (FI) and the materials management (MM) modules of SAP. We start by creating the master data in the system. We create master data for a new material and a new vendor and then link these together using an information record. After that we run through a transaction in which we purchase the material we just created from the vendor we also just created. As the various steps of the purchase are recorded in SAP, we will ask you to answer some questions about what is occurring.
Keep in mind that this business process is normally done by more than one person in order to properly segregate duties and maintain authorization controls. However, in this exercise you will do all of the steps from your individual SAP logon.
You will perform the following tasks:
Create a material master
Create a vendor master
Create an information record to link the vendor and material
Create a purchase order for the material
Receive the material
Receive the invoice from the vendor
Make payment to the vendor
For all of the following work you will use your own company code. This company code is based on the SAP number assigned to you (see Blackboard grades). Whenever you see the value XX in the assignment you will substitute your assigned SAP number. Be sure to use only your assigned SAP number.
Step 1: Create a Material Master record for a Trading Good
The material master record contains all the data required to define and manage a material. In SAP this is formally part of the Materials Management (MM) module. However, some important accounting information is also contained within this record. For example, product cost and pricing information and also tax information are contained within the material master record.
The master record consists of individual views, presented in the form of tabbed pages. These views are organized on a functional or departmental basis: each department has its own view that permits easy access and maintenance. In other words, data is integrated from engineering, manufacturing, sales and distribution, purchasing, accounting, and other departments. This master data is used as a source of data for purchase order processing throughout the procurement cycle. For simplicity, we are ordering a trading good that we will subsequently sell. Trading goods are items that we purchase for resale; other types of goods, such as raw materials, are used to manufacture finished products.
This final project presents my analysis of sales and methods of payment for electronics, fashions, entertainment, and other products offered by a superstore. I used many SQL commands including Create Table, Select, From, Where, Group By, Order By, Limit, Left Join, and Extract.
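A minimal, self-contained sketch of how those commands fit together, here run through Python's sqlite3. The table and column names are invented for illustration, and SQLite's strftime stands in for EXTRACT, which SQLite does not support:

```python
import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()

# CREATE TABLE: hypothetical superstore tables (names and columns are illustrative).
cur.execute("CREATE TABLE products (id INTEGER PRIMARY KEY, name TEXT, category TEXT)")
cur.execute("CREATE TABLE orders (id INTEGER, product_id INTEGER, amount REAL, order_date TEXT)")
cur.executemany("INSERT INTO products VALUES (?, ?, ?)", [
    (1, "Laptop", "electronics"),
    (2, "Jacket", "fashions"),
    (3, "Board game", "entertainment"),
])
cur.executemany("INSERT INTO orders VALUES (?, ?, ?, ?)", [
    (1, 1, 1200.0, "2022-05-01"),
    (2, 1, 1150.0, "2022-06-10"),
    (3, 2, 80.0, "2022-06-15"),
])

# SELECT / FROM / WHERE / LEFT JOIN / GROUP BY / ORDER BY / LIMIT in one query.
# strftime('%Y', ...) extracts the year, playing the role of EXTRACT here.
rows = cur.execute("""
    SELECT p.category, SUM(o.amount) AS total
    FROM products p
    LEFT JOIN orders o ON o.product_id = p.id
    WHERE strftime('%Y', o.order_date) = '2022'
    GROUP BY p.category
    ORDER BY total DESC
    LIMIT 5
""").fetchall()
print(rows)  # electronics first, with a total of 2350.0
```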
Contents
Phase 1: Design Concepts
Project Description
Use Cases
Data Dictionary
High Level Design Components
Detailed Design: Checkout
Diagrams
Design Analysis
Detailed Design: Product Research
Diagrams
Design – Using Pseudocode
Product Profit
Phase 2: Sequential Logic Structures
Design
Product Profit
Phase 3: Problem Solving with Decisions
Safe Discount
Return Customer Bonus
Applying Discounts
Phase 4: Problem Solving with Loops
Total Order
Problems to Solve
Calculate Profits
Rock, Paper, Scissors
Number Guessing Game
Phase 5: Using Abstractions in Design
Seeing Abstractions
Refactoring
Phase 1: Design Concepts
Project Description
Although we may be late to the game, we will nevertheless join the world of e-commerce to sell our fantastic product on the Internet. To do so, we need a Web site that will allow for commerce and sales. To be quick about it, we require the following:
· Searchable inventory and shopping pages
· A shopping cart
· A place for customers to register when they make purchases
· A checkout process to make the purchase
Within this main process, there are a bunch of other needs that must be met, as follows:
· We want to track the date of the last purchase a customer makes so we can offer incentives and discounts based on the last time they shopped.
· We will offer sales based on the number of different items that a person purchases.
· We will also give a discount for bulk orders, when a person buys many of the same item.
In addition to sales features, the solution must provide the ability to manage and research the sales of products. It must include the following:
· Must be able to add, update and remove product inventory in real time on the site
· Needs to have research capabilities to determine how well a product is selling, such as the following:
· How often the item is viewed, added to shopping carts, and then purchased
· How a price change affects sales and profit
Use Cases
From the description above, we can relate this to the following use cases, which describe how the user will interact with our system. Each use case is a set of screens that the users would interact with to accomplish something they need on the site.
In addition to the customer’s activity, the solution will allow Sales Analysts to manage and research product sales.
Data Dictionary
Variable Name        Type     Description
todaysDate           Date     Today's date, when the program is running
creationDate         Date     The date the customer created their account
priorPurchases       Integer  Number of purchases this customer has made in the past
lastPurchaseDate     Date     The date of the last purchase the customer made
lineItemPrice        Array    The price of each line item the customer has added to the cart
lineItemQuantity     Array    The quantity of each line item the customer has added to the cart
membershipLevel      Integer  The membership level of the customer's account (1 – Guest, 2 – Registered, 3 – Preferred)
totalPurchaseAmount  Double   The total amount of the purchase
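As a rough sketch, the totalPurchaseAmount could be derived from the other data-dictionary variables like this. The per-level discount rates are assumptions for illustration, not values taken from the design:

```python
# Sketch of a "Total order" calculation built from the data-dictionary variables.
# The discount rate per membership level is an assumption for illustration.
MEMBERSHIP_DISCOUNT = {1: 0.00, 2: 0.02, 3: 0.05}  # Guest, Registered, Preferred

def total_purchase_amount(line_item_price, line_item_quantity, membership_level):
    # Subtotal: price x quantity, summed over every line item in the cart.
    subtotal = sum(p * q for p, q in zip(line_item_price, line_item_quantity))
    # Apply the membership discount to get the final total.
    return subtotal * (1 - MEMBERSHIP_DISCOUNT[membership_level])

# A Preferred customer (level 3) with two line items: a 32.0 subtotal less 5%.
print(total_purchase_amount([10.0, 4.0], [2, 3], 3))
```

The design's other discounts (item-count sales, bulk orders, return-customer bonuses) would plug into the same function as additional adjustments on the subtotal.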
Brazilian Ecommerce OLIST Market Data Analysis (kaushikdey53)
Data Synopsis: Brazilian Ecommerce Public Dataset: retail datasets of 100K orders placed on Olist between Oct 2016 and Sep 2018 across several states. Information is tracked with price, order status, payment, freight, and user review, along with many other parameters.
Outcomes: Order Forecasting, Descriptive Analysis & Exploratory Data Analysis for the data set.
This document provides an overview of creating customer and material masters in SAP SD. It explains how to create a customer master record using transaction code XD01 and populate fields like address, payment details, sales area. It also discusses creating material stock using transaction code MB1C, creating a customer material info record with VD51, and getting a material stock overview with MMBE. The document concludes by explaining how to create a material master for the sales view using transaction code MM01.
Ecommerce Market Mix Modeling Using Linear Regression (Achal Kagwad)
As a data scientist working for ElecKart (Canada), we need to develop a "market mix model" based on the given information and data sets related to consumer purchases, monthly spends on advertising channels, climatic information, and the NPS/stock index.
We perform the process of EDA, Feature Engineering, Linear Regression Models (Additive and Multiplicative), Model Selection and Evaluation.
Data Warehousing and Business Intelligence is one of the hottest skills today, and is the cornerstone for reporting, data science, and analytics. This course teaches the fundamentals with examples plus a project to fully illustrate the concepts.
This document provides steps for updating ADS (Analytical Data Store) and KXEN models. It involves checking data availability, running various ADS projects to populate tables, performing sanity checks on table counts, and applying KXEN models to score different customer segments. The key steps are: 1) Check source data and run preliminary ADS, 2) Populate base tables and run additional ADS in sequence, 3) Perform sanity checks on table counts, 4) Apply KXEN models to score segments, changing settings for each segment.
• Developed and Analysed Data warehouse Using SSIS ETL tool, SSDT, SQL server
• Provided Analysed Quarterly Report Using SSRS of Total sales, Total Revenue, Predicted Future sales, topmost selling products, top discounted product.
• Used Performance tuning to fetch rows faster from database and performed data visualization using R-studio and Neo-4j.
Quick iteration and reusability of metric calculations for powerful data exploration.
At Looker, we want to make it easier for data analysts to service the needs of the data-hungry users in their organizations. We believe too much of their time is spent responding to ad hoc data requests and not enough time is spent building, experimenting, and embellishing a robust model of the business. Worse yet, business users are starving for data, but are forced to make important decisions without access to data that could guide them in the right direction. Looker addresses both of these problems with a YAML-based modeling language called LookML.
This paper walks through a number of data modeling examples, demonstrating how to use LookML to generate, alter, and update reports—without the need to rewrite any SQL. With LookML, you build your business logic, defining your important metrics once and then reusing them throughout a model—allowing quick, rapid iteration of data exploration, while also ensuring the accuracy of the SQL that’s generated. Small updates are quick and can be made immediately available to business users to manipulate, iterate, and transform in any way they see fit.
The document provides step-by-step instructions for customizing the check printing report in Oracle R12. It discusses developing customized templates, modifying code to include additional data, and setting up payment profiles and formats to display data using the customized templates. Key steps include: 1) Developing customized templates; 2) Adding code to retrieve additional data; 3) Creating template definitions, payment formats, documents, and profiles linked to the customized templates. This allows payments to be generated using the customized templates and layouts while retaining the option to use the standard templates.
This research is to find out whether promotional activities give better results than no promotional activities and how much it effects to purchase probability.
Copy Controls are programs in SAP SD that control how data is copied from one document to another when creating a new document based on an existing document. They consist of routines that determine which data fields are copied over. The standard system includes many routines to copy data between documents like sales documents, deliveries, and invoices. Additional routines can be created to meet other business needs. Copy Controls are configured using transaction codes that begin with "VT" and indicate the source and target documents. The controls determine what data is copied at the header, item, and schedule line levels and can be customized as needed for the business process.
SD and finance modules are integrated in SAP to automatically generate accounting documents for sales activities. When goods are dispatched, a material document is created which triggers a finance document to be generated, with the GL account and amount coming from OBYC settings. Similarly, when a billing document is released, the pricing procedure uses information like order type, customer, and sales area to select the appropriate procedure and determine the GL accounts and amounts to post to finance. This process of automatic accounting document generation from sales documents is known as SD-FI integration in SAP.
This document discusses organizational elements, master data, and transactions in SAP systems. It provides examples of organizational elements like client, company code, plant, and describes how master data like customer, material, and employee records are organized and related to organizational elements. It also defines transactions as application programs that execute business processes in SAP.
The document provides a case study about Litware, Inc., an online retailer that uses Microsoft Power BI. It outlines their existing sales data environment and reporting requirements. Key points include:
- Litware stores sales data in SQL Server and wants to connect additional data sources to Power BI.
- Users must be able to filter reports by month and ship month independently.
- Various departments have different visualization and access requirements around sales, returns, and targets data.
This document provides an in-depth reference on set analysis in QlikView. It begins by acknowledging that set analysis is a difficult subject, even for experienced users. The document then covers key aspects of set analysis syntax, including identifiers, operators, modifiers, and element lists. It provides many examples of how to use set analysis to filter charts and calculations based on specific field selections. The goal is to serve as a complete cheat sheet for set analysis in QlikView, with definitions, examples, and tips for effectively using this complex topic.
This document provides templates and guidelines for collecting and analyzing key performance indicator (KPI) data from various departments within a company. It includes templates for comparing sales forecasts to actual orders, production plans to orders and production output, order fulfillment to shipments, and other metrics. Guidelines are provided for completing the templates accurately and for calculating KPIs from the compiled data to measure areas like forecast accuracy, production performance, and delivery timeliness.
Dynamics GP Insights to Distribution - Inventory (Steve Chapman)
Dynamics GP includes integrated distribution functionality that makes it easy to control inventory and efficiently process purchases and customer orders.
The document discusses tracing a collective purchase requisition (PR) generated for a material with MRP type "PS" (collective requirements) that is used in multiple sales orders. There is no direct way to trace which PR is assigned to which sales order. The solution is to check the dependent requirement number from transaction MD04 for the material, then check the planned order in MD13 and view the assignment tab to see the associated sales order details. A z-report can also be created to fetch the material number, sales document number, purchase requisition number, and planned order number to link these objects.
The document discusses customer master data in SAP. It explains that customer master data contains key information about customers like addresses, payment terms, and delivery methods. It also describes the different account groups (such as sold-to party, bill-to party) and partner functions used to classify customer master records based on the business relationship. Steps are provided on how to create a new customer master record including entering required fields in the general, company code, and sales area data sections.
The document discusses setting up forward and reverse pricing scenarios in SAP. It provides details on:
1) The sender's client needs both forward and reverse pricing scenarios, and the forward scenario is working but they are stuck on replicating it in reverse.
2) An overview of the standard forward pricing procedure including condition types and pricing sequence.
3) The request is to build a reverse procedure where the user enters the total price of Rs. 106.20 and the system calculates the base price, VAT, and service tax values.
4) Suggestions are requested on how to set up the reverse pricing scenario.
Condition technique is a configuration technique in SAP used to configure complex business rules, such as pricing. It consists of several key components, including a field catalog, condition tables, an access sequence, condition types, pricing procedures, and pricing procedure determination. Condition tables contain business rules and are accessed in the order specified by the access sequence. Condition types represent logical components like taxes or discounts. Pricing procedures combine condition types and are assigned to documents like sales orders. Overall, condition technique provides a rules engine for flexibly configuring diverse and changing business rules through its various components.
Real-time Data Visualization Using Business Intelligence Techniques (MD Owes Quruny Shubho)
Real-time data visualization using business intelligence techniques, to make faster decisions on sales data.
Business Intelligence is a way of gaining advantage for a business using data. This data can be user information, stock information, sales reports, or any other source related to the business. From a large amount of data, business intelligence mines the information and converts it to knowledge, which plays a role in the decision support system. BI is a highly effective way to make data-driven decisions. BI visualizes data, giving us a visual view of the data that can be easily understood.
Level of Detail (LOD) Expressions allow users to specify the level of aggregation for calculations independently of the dimensions shown in the visualization. This overcomes prior limitations and allows users to answer questions involving multiple levels of granularity. There are three types of LOD expressions: INCLUDE to calculate at a lower level of detail, EXCLUDE to calculate at a higher level, and FIXED to specify an exact level of detail. LOD expressions provide an elegant way to gain insights from data and answer complex questions with a single calculation.
1. Account determination is the integration between SD and FICO modules that automatically posts prices, discounts, freight, and taxes to the appropriate GL accounts through account keys.
2. Account keys are defined in FICO and then assigned to condition types in pricing procedures in SD. Customer and material groups are used to group customers and materials for pricing purposes.
3. Account determination must be configured correctly for the system to generate accounting documents when invoices are saved, including maintaining account assignment groups in customer and material masters.
Here are the steps to create the RFQ:
1. Enter your purchase requisition number
2. Select all items
3. Click "Adopt" to copy item details to RFQ
4. Click "Save" to save the RFQ
This will create the RFQ with the item details copied from the purchase requisition. Now you can generate quotations by adding vendors.
The document discusses sales prediction for Big Mart stores. It outlines exploring store and product level hypotheses from sales data, data exploration including feature summaries and missing value imputation, feature engineering such as combining variables and imputing outliers, building linear regression models to predict future sales, and exporting cleaned data and models. The goal is to help Big Mart predict sales volumes to aid planning, inventory management, and remaining competitive.
This dashboard aims to evaluate the monthly sales achievement per month for mobiles and tablets, computing, and appliances in a superstore. The tool that is used in this project is Looker Studio.
This task presents SQL basic commands which I used to create a new table with its data. I queried the sales data of furniture, office supplies, and technology.
Similar to Final Project Python - Elyada Wigati Pramaresti.pptx
This document provides templates and guidelines for collecting and analyzing key performance indicator (KPI) data from various departments within a company. It includes templates for comparing sales forecasts to actual orders, production plans to orders and production output, order fulfillment to shipments, and other metrics. Guidelines are provided for completing the templates accurately and for calculating KPIs from the compiled data to measure areas like forecast accuracy, production performance, and delivery timeliness.
Dynamics gp insights to distribution - inventorySteve Chapman
Dynamics GP includes integrated distribution functionality that makes it easy to control inventory and efficiently process purchases and customer orders.
The document discusses tracing a collective purchase requisition (PR) generated for a material with MRP type "PS" (collective requirements) that is used in multiple sales orders. There is no direct way to trace which PR is assigned to which sales order. The solution is to check the dependent requirement number from transaction MD04 for the material, then check the planned order in MD13 and view the assignment tab to see the associated sales order details. A z-report can also be created to fetch the material number, sales document number, purchase requisition number, and planned order number to link these objects.
The document discusses customer master data in SAP. It explains that customer master data contains key information about customers like addresses, payment terms, and delivery methods. It also describes the different account groups (such as sold-to party, bill-to party) and partner functions used to classify customer master records based on the business relationship. Steps are provided on how to create a new customer master record including entering required fields in the general, company code, and sales area data sections.
The document discusses setting up forward and reverse pricing scenarios in SAP. It provides details on:
1) The sender's client needs both forward and reverse pricing scenarios, and the forward scenario is working but they are stuck on replicating it in reverse.
2) An overview of the standard forward pricing procedure including condition types and pricing sequence.
3) The request is to build a reverse procedure where the user enters the total price of Rs. 106.20 and the system calculates the base price, VAT, and service tax values.
4) Suggestions are requested on how to set up the reverse pricing scenario.
Condition technique is a configuration technique in SAP used to configure complex business rules, such as pricing. It consists of several key components, including a field catalog, condition tables, an access sequence, condition types, pricing procedures, and pricing procedure determination. Condition tables contain business rules and are accessed in the order specified by the access sequence. Condition types represent logical components like taxes or discounts. Pricing procedures combine condition types and are assigned to documents like sales orders. Overall, condition technique provides a rules engine for flexibly configuring diverse and changing business rules through its various components.
Real -time data visualization using business intelligence techniques. and mak...MD Owes Quruny Shubho
Real-time data visualization using business intelligence techniques. and make a faster decision on sales data.
Business Intelligence is a way of gaining advantage form business using data. This data can be User information, Stock information sales report or any source that related to its business. From a large amount of data, business intelligence mining the information and convert them to knowledge which plays a role for the decision support system.BI is a mass effective way to make a data-driven decision.BI Visualize data and give us a visual look of data that can be easily understood.
Level of Detail (LOD) Expressions allow users to specify the level of aggregation for calculations independently of the dimensions shown in the visualization. This overcomes prior limitations and allows users to answer questions involving multiple levels of granularity. There are three types of LOD expressions: INCLUDE to calculate at a lower level of detail, EXCLUDE to calculate at a higher level, and FIXED to specify an exact level of detail. LOD expressions provide an elegant way to gain insights from data and answer complex questions with a single calculation.
1. Account determination is the integration between SD and FICO modules that automatically posts prices, discounts, freight, and taxes to the appropriate GL accounts through account keys.
2. Account keys are defined in FICO and then assigned to condition types in pricing procedures in SD. Customer and material groups are used to group customers and materials for pricing purposes.
3. Account determination must be configured correctly for the system to generate accounting documents when invoices are saved, including maintaining account assignment groups in customer and material masters.
Here are the steps to create the RFQ:
1. Enter your purchase requisition number
2. Select all items
3. Click "Adopt" to copy item details to RFQ
4. Click "Save" to save the RFQ
This will create the RFQ with the item details copied from the purchase requisition. Now you can generate quotations by adding vendors.
The document discusses sales prediction for Big Mart stores. It outlines exploring store and product level hypotheses from sales data, data exploration including feature summaries and missing value imputation, feature engineering such as combining variables and imputing outliers, building linear regression models to predict future sales, and exporting cleaned data and models. The goal is to help Big Mart predict sales volumes to aid planning, inventory management, and remaining competitive.
3. About the Dataset
The data analyzed in this task is collected from Tokopedia (not the original data). The dataset consists of four tables:

order_detail:
Variable | Data Type | Description
id | object | the unique order number (id_order)
customer_id | object | the unique customer number
order_date | object | the date the transaction was carried out
sku_id | object | the unique product number (SKU is stock keeping unit)
price | int64 | the amount of money given in payment for an item
qty_ordered | int64 | the number of items purchased by the customer
before_discount | float64 | the total price of the products before the discount
discount_amount | float64 | the discount value on the total product price
after_discount | float64 | the total price after the discount is applied
is_gross | int64 | indicates the customer has not yet paid the order
is_valid | int64 | indicates the customer has paid the order
is_net | int64 | indicates the transaction is finished
payment_id | int64 | the unique payment-method number

sku_detail:
Variable | Data Type | Description
id | object | the unique product number (can be used as a join key)
sku_name | object | the name of the product
base_price | float64 | the price shown on the price tag
cogs | int64 | the cost of selling one product
category | object | the product category

customer_detail:
Variable | Data Type | Description
id | object | the unique customer number
registered_date | object | the date the customer signed up as a member

payment_detail:
Variable | Data Type | Description
id | int64 | the unique payment number
payment_method | object | the payment method applied during the transaction
9. Pre-Processing the Data
We have four tables, and we need to join them by running SQL in Google Colab so that we can carry out further analysis. In this case, LEFT JOIN is used.
1. Set up SQL in Google Colab. 2. Write the SQL queries to combine the four tables.
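The join described above can be sketched as follows. This is a minimal sketch, not the original notebook: it uses Python's built-in sqlite3 as the SQL engine (the slides run SQL inside Colab), and the toy rows are invented stand-ins; only the table and column names come from the dataset description.

```python
import sqlite3
import pandas as pd

# Toy stand-ins for the four tables (column names follow the dataset description).
order_detail = pd.DataFrame({
    "id": ["O1", "O2"], "customer_id": ["C1", "C2"],
    "sku_id": ["S1", "S2"], "payment_id": [1, 1],
    "qty_ordered": [2, 3], "before_discount": [200.0, 300.0],
    "is_valid": [1, 0],
})
sku_detail = pd.DataFrame({"id": ["S1", "S2"], "sku_name": ["A", "B"],
                           "category": ["Mobiles & Tablets", "Others"]})
customer_detail = pd.DataFrame({"id": ["C1", "C2"],
                                "registered_date": ["2021-01-05", "2022-03-10"]})
payment_detail = pd.DataFrame({"id": [1], "payment_method": ["cod"]})

# Load the frames into an in-memory SQLite database, then LEFT JOIN them so
# every order row is kept and enriched with the three lookup tables.
con = sqlite3.connect(":memory:")
for name, frame in [("order_detail", order_detail), ("sku_detail", sku_detail),
                    ("customer_detail", customer_detail),
                    ("payment_detail", payment_detail)]:
    frame.to_sql(name, con, index=False)

df = pd.read_sql_query("""
    SELECT od.*, sd.sku_name, sd.category,
           cd.registered_date, pm.payment_method
    FROM order_detail od
    LEFT JOIN sku_detail      sd ON od.sku_id      = sd.id
    LEFT JOIN customer_detail cd ON od.customer_id = cd.id
    LEFT JOIN payment_detail  pm ON od.payment_id  = pm.id
""", con)
print(df.shape)
```

Because order_detail is the left table, no orders are dropped even if a lookup row is missing.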
11. Pre-Processing the Data
Check the data type of each column. This is carried out to verify that each column has the correct type.
12. Pre-Processing the Data
The data types of some columns need to be changed to make them easier to process. The numeric columns are converted to integers, and the date column is given a new datetime format so that the dates can be ordered.
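The two conversions above can be sketched as follows; the sample values are illustrative, not the real data.

```python
import pandas as pd

df = pd.DataFrame({
    "qty_ordered": ["2", "3"],                       # numbers stored as text
    "order_date": ["2022-01-15", "2021-12-31"],      # dates stored as text
})

# Convert the numeric column to integers and the date column to datetime,
# then sort so the rows are in chronological order.
df["qty_ordered"] = df["qty_ordered"].astype(int)
df["order_date"] = pd.to_datetime(df["order_date"], format="%Y-%m-%d")
df = df.sort_values("order_date")
print(df.dtypes)
```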
Queries
Result
14. Problem Statements
1. Dear Data Analyst,
At the end of this year, the company will provide prizes for the consumers who win
the Year-End Festival Competition. The Marketing Team needs help to determine the
prizes which will be given to the winners. The prizes will be taken from the Top 5
products from Mobiles and Tablets Category in 2022, with the highest order quantity
(valid = 1).
Please help send the data before the end of this month to the Marketing Team. We
appreciate your help. Thank you.
Kind regards,
Marketing Team
15. Explanation:
We need to filter the data. First, filter the category column: it contains several categories, and we only need Mobiles & Tablets. Next, filter the order date to transactions in 2022 by writing >= '2022-01-01' (the first transaction date in 2022) and <= '2022-12-31' (the last transaction date in 2022). After obtaining the transaction data, we also filter on 'is_valid' to keep only valid (paid) transactions.
Second, we aggregate the data by product and total quantity, so we use the groupby function. Use sku_name for the product name and qty_ordered for the order quantity, with the sum function to calculate the total quantity. We also need to flatten the header and sort the data, so write reset_index() and sort_values by "qty_ordered" with ascending=False to order the data from highest to lowest.
Third, display the top 5 products by writing df_filter.head(5).
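The filter-group-sort-head pipeline described above can be sketched like this; the toy frame and product names (P1, P2, P3) are invented for illustration.

```python
import pandas as pd

df = pd.DataFrame({
    "category": ["Mobiles & Tablets"] * 3 + ["Others"],
    "sku_name": ["P1", "P1", "P2", "P3"],
    "order_date": pd.to_datetime(
        ["2022-02-01", "2022-03-01", "2022-04-01", "2022-05-01"]),
    "qty_ordered": [5, 3, 4, 9],
    "is_valid": [1, 1, 1, 1],
})

# Filter: Mobiles & Tablets category, year 2022, valid (paid) transactions.
mask = (
    (df["category"] == "Mobiles & Tablets")
    & (df["order_date"] >= "2022-01-01")
    & (df["order_date"] <= "2022-12-31")
    & (df["is_valid"] == 1)
)

# Aggregate total quantity per product, restore a flat header,
# sort from highest to lowest, and keep the top 5.
df_filter = (
    df[mask]
    .groupby("sku_name")["qty_ordered"].sum()
    .reset_index()
    .sort_values("qty_ordered", ascending=False)
)
print(df_filter.head(5))
```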
16. Result
Analysis:
The sales data for Mobiles & Tablets in 2022 shows that IDROID_BALRX7-Gold is the most popular product, with an order quantity reaching 2,000. It is followed by RS_Coconut Bites with an order quantity of 300. The large gap between the two may be caused by the better features of IDROID_BALRX7-Gold and the different marketing strategies for these two products.
17. Problem Statements
2. Dear Data Analyst,
Following up on the meeting with the Warehouse Team and Marketing Team, we found out
that there was plenty of product stock with Others Category available at the end of 2022.
1. We kindly ask for your help to check the sales data for this category in 2021 based on
sales quantity. Our temporary assumption is that there was a decrease in the sales
quantity in 2022 compared to 2021. Please also display the data of the 15 products
from the category.
2. If there is a decrease in the sales quantity of the Others Category, we ask for your help to present the data of the TOP 20 products with the highest decrease in 2022 compared with 2021. Please send the result no later than 4 days from today. We will use the result for the next meeting. We appreciate your help. Thank you.
Kind regards,
Warehouse Team
18. 2.1
Explanation:
First, we need to filter the valid transaction data in 2021 (filter on 'is_valid'). We specify the date by writing >= '2021-01-01' (the first transaction date in 2021) and <= '2021-12-31' (the last transaction date in 2021).
Second, use the groupby function to group the 2021 data. Use category for the product category and qty_ordered for the order quantity, with the sum function to calculate the total quantity. We also need to flatten the header and sort the data, so write reset_index() and sort_values by "qty_ordered" with ascending=False to order the data from highest to lowest.
Apply the same query with the 2022 dates to filter the 2022 data and create a group for the 2022 data.
19. 2.1
Explanation:
After sorting and grouping the data for 2021 and 2022, we merge the two tables using the merge function on 'category'. We then rename the column 'qty_ordered_x' to 'Quantity 2021' and the column 'qty_ordered_y' to 'Quantity 2022'. Next, we add a new column calculated as 'Quantity 2022' minus 'Quantity 2021'.
20. 2.1
Explanation:
Furthermore, we also need to create a new column in which the condition can be applied. In this case, we use df_merged[] =
df_merged[].apply(condition) function. Then, display the merged data from the highest to the lowest by using ascending. As
we only need 15 data, add head(15).
We also need to create a parameter to determine whether the sales data can be considered as an increase or decrease. Add
the if value function that will process any value that is larger than 0, considered as an increase, and if not, then the value falls
under the decrease category.
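The merge, rename, difference column, and labeling condition described above can be sketched as follows; the yearly totals are toy numbers, except that the Others figure mirrors the 147-unit drop reported in the result slide.

```python
import pandas as pd

# Toy yearly aggregates, as produced by the groupby step for each year.
qty_2021 = pd.DataFrame({"category": ["Others", "Beauty & Grooming"],
                         "qty_ordered": [500, 200]})
qty_2022 = pd.DataFrame({"category": ["Others", "Beauty & Grooming"],
                         "qty_ordered": [353, 250]})

# Merge on category; overlapping column names get _x/_y suffixes.
df_merged = qty_2021.merge(qty_2022, on="category")
df_merged = df_merged.rename(columns={"qty_ordered_x": "Quantity 2021",
                                      "qty_ordered_y": "Quantity 2022"})
df_merged["Difference"] = df_merged["Quantity 2022"] - df_merged["Quantity 2021"]

# Label each row as an increase or a decrease based on the difference.
def condition(value):
    return "Increase" if value > 0 else "Decrease"

df_merged["Status"] = df_merged["Difference"].apply(condition)
print(df_merged.sort_values("Difference", ascending=False).head(15))
```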
21. Result
Analysis:
The categories that show a decrease in sales in 2022 are Others, Soghaat, Men Fashion, and Beauty & Grooming. Others is the category that experienced the biggest downturn, as its sales quantity dropped by 147 from 2021 to 2022.
22. 2.2
Explanation:
The query for 2.2 is quite similar to 2.1. However, when filtering the 2021 and 2022 data, add the category column to the filter; it is used to restrict the new table to the Others Category. The continuation symbol shows that the query continues on the same logical line.
23. 2.2
Explanation:
After sorting and grouping the data for 2021 and 2022, we merge the two tables using the merge function on 'sku_name'. We then rename the column 'qty_ordered_x' to 'Quantity 2021' and the column 'qty_ordered_y' to 'Quantity 2022'. Next, we add a new column calculated as 'Quantity 2022' minus 'Quantity 2021'.
24. 2.2
Explanation:
We again need a parameter to decide whether the sales figures count as an increase or a decrease: an if function that labels any value larger than 0 as an increase and anything else as a decrease.
Create a new column in which this condition is applied, using df_merged[...] = df_merged[...].apply(condition). Then display the merged data from the highest to the lowest by using ascending=False. As we need 20 rows, add head(20).
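The part that differs from 2.1, filtering to the Others Category and grouping by product name, with a backslash continuing the chain across lines, can be sketched like this (toy rows, invented product names):

```python
import pandas as pd

df = pd.DataFrame({
    "category": ["Others", "Others", "Beauty & Grooming"],
    "sku_name": ["K1", "K2", "K3"],
    "qty_ordered": [10, 7, 5],
    "is_valid": [1, 1, 1],
})

# Same pattern as 2.1, but restricted to the Others category and grouped by
# sku_name instead of category.  The trailing backslash is the continuation
# symbol that keeps one logical statement across several physical lines.
others = df[(df["is_valid"] == 1) & (df["category"] == "Others")] \
    .groupby("sku_name")["qty_ordered"].sum() \
    .reset_index() \
    .sort_values("qty_ordered", ascending=False)
print(others)
```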
25. Result
Analysis:
The product from the Others Category with the biggest decrease is RB-Dettol Germ Busting Kit_bf, whose downturn reaches 155, followed by Telemail_MM-DR-HB-L, whose sales quantity dropped by 21. The drop may be caused by poor marketing, so the company should give attention to these two products to overcome the sales decrease.
26. Problem Statements
3. Dear Data Analyst,
Following the company’s anniversary in the next 2 months, the Digital Marketing Team will
provide promotional information to customers at the end of this month. The customer
criteria that we need are those who have checked out but have not yet made a payment
(is_gross = 1) in 2022. The data we need are the Customer ID and Registered Date.
Please send the data before the end of this month to the Digital Marketing Team. Thank you
for your help.
Kind regards,
Digital Marketing Team
27. Explanation:
We need to filter the data to find customers who have checked out but have not yet made a payment. Therefore, we filter the data based on is_gross, is_net, and order_date, and also check is_valid to ensure the data is correct. Then display the result as a data frame.
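A sketch of that filter follows. It assumes "checked out but not paid" means is_gross = 1 with is_valid = 0 and is_net = 0, an interpretation of the flags in the dataset description; the rows themselves are toy data.

```python
import pandas as pd

df = pd.DataFrame({
    "customer_id": ["C1", "C2", "C3"],
    "registered_date": ["2021-05-01", "2022-02-10", "2022-06-30"],
    "order_date": pd.to_datetime(["2022-03-01", "2022-04-01", "2021-08-01"]),
    "is_gross": [1, 1, 1],
    "is_valid": [0, 1, 0],
    "is_net": [0, 1, 0],
})

# Checked out (is_gross = 1) but not yet paid (is_valid = 0, is_net = 0),
# restricted to orders placed in 2022.
checked_out = df[
    (df["is_gross"] == 1) & (df["is_valid"] == 0) & (df["is_net"] == 0)
    & (df["order_date"] >= "2022-01-01") & (df["order_date"] <= "2022-12-31")
]

# Keep only the two columns the Digital Marketing Team asked for.
result = checked_out[["customer_id", "registered_date"]]
print(result)
```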
Explanation:
The Digital Marketing Team needs the data analyst to send the data. To download it, we import files from google.colab, convert the data frame to CSV format, and call files.download().
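The export step can be sketched as follows; the filename is a made-up example, and the Colab-only download call is commented out so the snippet also runs outside Colab.

```python
import pandas as pd

result = pd.DataFrame({"customer_id": ["C1"],
                       "registered_date": ["2021-05-01"]})

# Write the filtered data to a CSV file.
result.to_csv("checkout_no_payment.csv", index=False)

# In Google Colab only: send the file to the browser for download.
# from google.colab import files
# files.download("checkout_no_payment.csv")
```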
30. Problem Statements
4. Dear Data Analyst,
From October to December 2022, we will carry out campaigns every Saturday and Sunday.
We will assess whether the campaign has an impact on the sales increase (before_discount).
We kindly ask for your help to display the data:
1. The average daily weekend sales (Saturday and Sunday) vs average daily weekday
sales (Monday-Friday) per month. Is there an increase in sales in each month?
2. The average daily weekend sales (Saturday and Sunday) vs average daily weekday
sales (Monday-Friday) for 3 months.
Please send the data no later than next week. We appreciate your help. Thank you.
Kind regards,
Digital Marketing Team
32. 4.1
Explanation:
We need to create two tables: one with the weekend data and one with the weekday data. First, filter the weekend data on is_valid = 1, the months October, November, and December of 2022, and the days Saturday and Sunday. Use the isin function to check whether each value in the data frame is in the given list and keep the matching rows.
Second, group the data by month and sales value. Use 'month' for the months and 'before_discount' for the sales. As we need to aggregate the average daily sales, we use the sum function. We also need to flatten the header and sort the data, so write reset_index() and sort_values by "before_discount" with ascending=False to order the data from highest to lowest.
Use the same syntax for the weekday data.
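The weekend/weekday split can be sketched as below. The rows are toy data (the listed dates really fall on the stated days in 2022), and this sketch uses .mean() to get the daily average directly, an assumption; the slides aggregate before_discount with sum.

```python
import pandas as pd

df = pd.DataFrame({
    # 2022-10-01 Sat, 2022-10-03 Mon, 2022-11-05 Sat, 2022-11-07 Mon
    "order_date": pd.to_datetime(
        ["2022-10-01", "2022-10-03", "2022-11-05", "2022-11-07"]),
    "before_discount": [100.0, 80.0, 90.0, 120.0],
    "is_valid": [1, 1, 1, 1],
})
df["month"] = df["order_date"].dt.month
df["day"] = df["order_date"].dt.day_name()

# Valid transactions in October-December 2022.
base = df[(df["is_valid"] == 1)
          & (df["order_date"] >= "2022-10-01")
          & (df["order_date"] <= "2022-12-31")]

# isin() keeps rows whose day name is in the given list; ~ negates the mask.
weekend = base[base["day"].isin(["Saturday", "Sunday"])]
weekday = base[~base["day"].isin(["Saturday", "Sunday"])]

weekend_avg = weekend.groupby("month")["before_discount"].mean().reset_index()
weekday_avg = weekday.groupby("month")["before_discount"].mean().reset_index()
print(weekend_avg)
print(weekday_avg)
```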
33. 4.1
Explanation:
Merge the two tables so the weekend and weekday data can be viewed side by side. The function used in this step is merge.
34. 4.1
Explanation:
We again need a parameter to decide whether the sales figures count as an increase or a decrease: an if function that labels any value larger than 0 as an increase and anything else as a decrease.
Create a new column in which this condition is applied, using df_merged[...] = df_merged[...].apply(condition). Then display the merged data from the highest to the lowest by using ascending=False.
35. 4.2
Explanation:
Create a bar chart to make it easier for users to compare the weekend and weekday sales. Use the plot function to make the bar chart.
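A minimal sketch of such a bar chart follows; the averages are invented placeholder values, not the real results, and the Agg backend is selected so the chart also renders without a display outside Colab.

```python
import matplotlib
matplotlib.use("Agg")  # headless rendering (no display needed)
import matplotlib.pyplot as plt
import pandas as pd

# Placeholder monthly averages for illustration only.
avg_sales = pd.DataFrame({
    "month": ["Oct", "Nov", "Dec"],
    "weekday": [950.0, 700.0, 900.0],
    "weekend": [900.0, 850.0, 800.0],
})

# A grouped bar chart: one pair of bars (weekday vs weekend) per month.
ax = avg_sales.plot(x="month", y=["weekday", "weekend"], kind="bar", rot=0)
ax.set_ylabel("average daily sales (before_discount)")
ax.set_title("Weekday vs weekend average daily sales, Oct-Dec 2022")
plt.tight_layout()
plt.savefig("weekend_vs_weekday.png")
```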
36. Result
Analysis:
From the processed data, it is apparent that the average sales were highest in October 2022. Across the three months, the average weekday sales were higher than the weekend sales. There was a sharp drop in average weekday sales in November, but they rose again in December. The weekend sales, on the other hand, decreased continually from October to December. Therefore, the company needs to give more attention to the weekend sales; the campaign strategy may need to change to improve the average weekend sales.
38. Follow me!
Instagram : elyadawigatip
Twitter : @EliNoBishamon
LinkedIn : https://www.linkedin.com/in/elyada-wigati-pramaresti-1a2387170/
Bootcamp Data Analysis
by @myskill.id