1. Intelligent Indian Dishes Recommendation Based on Ingredients
Submitted to:
Department of Computer Science and Engineering, GEHU Dehradun

Submitted by:
Ayushi Gupta (1918294)
Nisha Majhi (19012480)
Aditya Pratap Singh (1918155)
Bittu Day (1918302)
B.Tech CSE (7th Semester)
2. Objective
The goal of the application is to provide a platform where users can
try traditional food recipes from different parts of the country that
are high in quality and budget friendly. It is useful for anyone who
wants to try new foods but hesitates to ask locals. For now, this
application targets a local audience, and at present it is available
only as a web application that recommends the best Indian food.
3. Motivation
This application helps users find their favourite Indian cuisine according to their preferences and the
ingredients they have. It can bring out the best food places in the country, considering food quality and
service. In this way people can explore the country too, and remember each place for its culture, food, and
talented people. The recommendation system is especially beneficial for people who are introverted: one
click and there you go! It can also resolve communication barriers between people belonging to different
parts of the country.
4. Overview of Object-Oriented Programming
As the name suggests, Object-Oriented Programming (OOP) refers to languages
that use objects in programming. Object-oriented programming aims to model
real-world entities in code using concepts such as inheritance, data hiding, and
polymorphism. The main aim of OOP is to bind together the data and the functions
that operate on it, so that no other part of the code can access this data except
through those functions.
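The binding of data and functions described above can be sketched in Python. This is a minimal illustration, and the `Recipe` class and its methods are hypothetical names chosen to fit this project, not part of the actual system:

```python
# Encapsulation sketch: the ingredient list is kept private (name-mangled),
# so outside code can reach it only through the class's own methods.
class Recipe:
    def __init__(self, name):
        self.name = name
        self.__ingredients = []  # hidden from code outside the class

    def add_ingredient(self, ingredient):
        self.__ingredients.append(ingredient)

    def ingredient_count(self):
        return len(self.__ingredients)


dal = Recipe("Dal Tadka")
dal.add_ingredient("lentils")
dal.add_ingredient("ghee")
print(dal.ingredient_count())  # the data is reachable only via the methods
```

Attempting `dal.__ingredients` from outside the class raises an `AttributeError`, which is exactly the access restriction the paragraph describes.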
5. OBJECTS
Any entity that has state and behavior is known as an
object. For example, a chair, pen, table, keyboard, bike,
etc. It can be physical or logical.
An Object can be defined as an instance of a class. An
object contains an address and takes up some space in
memory. Objects can communicate without knowing
the details of each other's data or code. The only
necessary thing is the type of message accepted and the
type of response returned by the objects.
Example: A dog is an object because it has states like
color, name, breed, etc. as well as behaviors like
wagging the tail, barking, eating, etc.
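The dog example above can be written directly as a class: the constructor arguments hold the states and the methods are the behaviors. The names here are purely illustrative:

```python
# An object with state (color, name, breed) and behavior (barking, eating).
class Dog:
    def __init__(self, name, breed, color):
        # state: the values this object holds
        self.name = name
        self.breed = breed
        self.color = color

    # behavior: the operations the object can perform
    def bark(self):
        return f"{self.name} says woof!"

    def eat(self, food):
        return f"{self.name} eats {food}"


tommy = Dog("Tommy", "Labrador", "brown")  # an instance of the class
print(tommy.bark())  # Tommy says woof!
```

Here `tommy` is the object: an instance of `Dog` that occupies memory and communicates only through the messages (method calls) it accepts.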
6. Hybrid technique
Consider the individual recommendation approaches as terms in a linear combination: the technique assigns
each approach a weight and sums the weighted results. Assuming c recommendation approaches are combined
with a weighted strategy, the predicted score of user u for item i can be calculated as:

P(u, i) = Σ_{k=1}^{c} w_k · p_k(u, i)

where w_k denotes the weight of algorithm k and p_k(u, i) its predicted score. Since we are interested in
combining the 2 recommendation approaches, we set c = 2. According to [14], if c = 2 the calculation of the
prediction score can be written as:

P(u, i) = w · p_1(u, i) + (1 − w) · p_2(u, i)

and the optimized weight can be obtained by calculating the value of w that minimizes the error between the
combined predictions and the known ratings r_{u,i}:

w* = argmin_w Σ_{(u,i)} ( r_{u,i} − P(u, i) )²
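The c = 2 case can be sketched in a few lines of Python. The score dictionaries below are made-up illustrative data, not output of the actual recommenders:

```python
# Weighted hybrid with c = 2: combine two recommenders' scores with weight w
# for the first approach and (1 - w) for the second.
def hybrid_score(p1, p2, w):
    """P(u, i) = w * p1(u, i) + (1 - w) * p2(u, i), per item."""
    return {item: w * p1[item] + (1 - w) * p2[item] for item in p1}


content_scores = {"biryani": 0.9, "dosa": 0.4}  # e.g. ingredient-based scores
collab_scores = {"biryani": 0.6, "dosa": 0.8}   # e.g. collaborative scores

# with w = 0.7: biryani ≈ 0.7*0.9 + 0.3*0.6 = 0.81, dosa ≈ 0.52
print(hybrid_score(content_scores, collab_scores, 0.7))
```

In practice w would be tuned as in the optimization step above, e.g. by a grid search over w in [0, 1] against held-out ratings.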
7. Web Scraping
Web scraping is an automated method of extracting large amounts of data from web pages. Most of this data
is unstructured data in HTML format, which is then converted into structured data in a spreadsheet or
database so that it can be used in different applications. There are many different ways to perform web
scraping: using online services, using specific APIs, or even writing code to scrape a site from scratch.
Many big websites like Google, Twitter, Facebook, StackOverflow, etc. have APIs that give you structured
access to their data. This is the best option, but other sites either do not allow users to access large
amounts of data in a structured form or are simply not that technologically advanced. In those situations it
is best to use web scraping to extract the data.
Web scraping requires two parts: a crawler and a scraper. The crawler is an algorithm that browses the web
looking for the specific data you want by following links on the Internet. A scraper, on the other hand, is a
tool created to extract data from a web page. The design of the scraper can vary considerably with the
complexity and scope of the project, so that data can be extracted quickly and accurately.
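The crawler half, following links on a page, can be sketched with Python's standard-library HTML parser. The inline HTML snippet below stands in for a fetched page, so no real network access is involved, and the link paths are invented for illustration:

```python
# Crawler sketch: collect every href on a page so the links can be followed.
from html.parser import HTMLParser


class LinkCollector(HTMLParser):
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # record the href attribute of each anchor tag we encounter
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)


page = ('<html><body><a href="/recipes/biryani">Biryani</a>'
        '<a href="/recipes/dosa">Dosa</a></body></html>')
collector = LinkCollector()
collector.feed(page)
print(collector.links)  # ['/recipes/biryani', '/recipes/dosa']
```

A full crawler would fetch each collected URL in turn and repeat, typically with a visited-set to avoid loops.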
8. How do web scrapers work?
Web scrapers can extract all the data on specific pages, or only the specific data the user wants.
Ideally, you specify exactly the data you want, so that the web scraper extracts only that data, quickly.
For example, you may want to scrape an Amazon page for the types of juicers available, but only the
model data of the different juicers and not the customer reviews. So when a web scraper needs to scrape a
website, it is first given the URLs. It then loads all the HTML code for those pages, and a more advanced
scraper can even extract all the CSS and JavaScript elements too. The scraper then gets the required data
from this HTML code and outputs it in the format specified by the user. This is usually an Excel
spreadsheet or a CSV file, but data can also be saved in other formats, such as a JSON file.
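The scraper half of the workflow above, pulling only the wanted fields out of loaded HTML and writing them out as CSV, can be sketched as follows. The HTML snippet, the `dish` class name, and the dish names are all made up for illustration:

```python
# Scraper sketch: extract the text of every <span class="dish"> element
# from already-loaded HTML, then emit the result as CSV.
import csv
import io
from html.parser import HTMLParser


class DishScraper(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_dish = False
        self.dishes = []

    def handle_starttag(self, tag, attrs):
        if tag == "span" and ("class", "dish") in attrs:
            self.in_dish = True

    def handle_data(self, data):
        if self.in_dish:
            self.dishes.append(data)
            self.in_dish = False


html_page = '<span class="dish">Paneer Tikka</span><span class="dish">Samosa</span>'
scraper = DishScraper()
scraper.feed(html_page)

# write the structured result as CSV (into a string buffer here; a real
# scraper would write to a file or database instead)
buf = io.StringIO()
writer = csv.writer(buf)
writer.writerow(["dish"])
for dish in scraper.dishes:
    writer.writerow([dish])
print(buf.getvalue())
```

Swapping `csv` for the stdlib `json` module would produce the JSON output format mentioned above instead.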