This document provides information on using Perl to interact with and manipulate databases. It discusses:
- Using the DBI module to connect to databases in a vendor-independent way
- Installing Perl modules like DBI and DBD drivers to connect to specific databases like Postgres
- Preparing the Postgres database environment, including initializing and starting the database
- Using the DBI handler and statements to connect to and execute queries on the database
- Retrieving and manipulating database records through functions like SELECT, adding new records, etc.
The document provides code examples for connecting to Postgres with Perl, executing queries to retrieve data, and manipulating the database through operations such as inserting new records.
Perl Programming - 04 Programming Database
1. 04 - Perl Programming
Database
Danairat T.
Line ID: Danairat
FB: Danairat Thanabodithammachari
+668-1559-1446
2. Perl at the Client Side
Perl and Database
• Perl uses the DBI (Database Interface) module for database integration. The DBI is database-independent, which means it can work with a vendor-specific DBD (Database Driver) for any database, such as PostgreSQL, MySQL, Oracle, Sybase, Informix, Access, ODBC, etc.
(Diagram: Perl Program → DBI → DBD::Pg → Postgres, and DBI → DBD::Oracle → Oracle)
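Because only the DSN (data source name) string names the driver, the same DBI calling code can target either backend in the diagram. A minimal pure-Perl sketch of that idea follows; the `make_dsn` helper and the Oracle `sid` value are illustrative assumptions, not part of the slides, and the actual connect call (shown in a comment) needs a live server:

```perl
#!/usr/bin/perl
use strict;
use warnings;

# Build DBI DSN strings for two different drivers; only the
# "dbi:<Driver>:" prefix changes, the calling code stays the same.
sub make_dsn {
    my ($driver, %attr) = @_;
    my $opts = join ';', map { "$_=$attr{$_}" } sort keys %attr;
    return "dbi:$driver:$opts";
}

my $pg_dsn  = make_dsn('Pg',     dbname => 'postgres');
my $ora_dsn = make_dsn('Oracle', sid    => 'ORCL');

print "$pg_dsn\n";   # dbi:Pg:dbname=postgres
print "$ora_dsn\n";  # dbi:Oracle:sid=ORCL

# With a live server, either DSN is used the same way:
#   use DBI;
#   my $dbh = DBI->connect($pg_dsn, "postgres", "") or die $DBI::errstr;
```

Swapping backends then means changing one string, provided the SQL itself stays portable.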
4. Perl Modules for Database
• Query your existing modules from the command line:-
perl -MFile::Find=find -MFile::Spec::Functions -lwe 'find { wanted => sub { print canonpath $_ if /\.pm\z/ }, no_chdir => 1 }, @INC'
• The DBI is normally located at /usr/perl5/vendor_perl/5.8.4/i86pc-solaris-64int/
• The DBDs are normally located at /usr/perl5/vendor_perl/5.8.4/i86pc-solaris-64int/DBD/
5. Perl Modules for Database
• Install the Perl DBI:-
1. Download the DBI module "DBI-1.58.tar.gz" from www.cpan.org
2. Extract the archive and cd into the directory:
gzip -cd DBI-1.58.tar.gz | tar xf -
3. Run the standard Perl module installation:
i. perl Makefile.PL
ii. make
iii. make test
iv. make install
4. Finished.
6. Perl Modules for Database
• Install the Perl DBD for the Postgres database:-
1. Download the DBD module "DBD-Pg-2.15.1.tar.gz" from www.cpan.org
2. Extract the archive and cd into the directory:
gzip -cd DBD-Pg-2.15.1.tar.gz | tar xf -
3. Run the standard Perl module installation:
i. perl Makefile.PL
ii. make
iii. make test
iv. make install
4. Finished.
7. Prepare Database Environment (Postgres)
1. Verify the system has the Postgres packages using pkginfo:-
• bash-3.00# pkginfo -i SUNWpostgr-server
• system SUNWpostgr-server The programs needed to create and run a PostgreSQL 8.1.17 server
• bash-3.00# pkginfo -i SUNWpostgr
• system SUNWpostgr PostgreSQL 8.1.17 client programs and libraries
• bash-3.00# pkginfo -i SUNWpostgr-libs
• system SUNWpostgr-libs The shared libraries required for any PostgreSQL 8.1.17 clients
• bash-3.00# pkginfo -i SUNWpostgr-server-data
• system SUNWpostgr-server-data The data directories needed to create and run a PostgreSQL 8.1.8 server
• bash-3.00# pkginfo -i SUNWpostgr-contrib
• system SUNWpostgr-contrib Contributed source and binaries distributed with PostgreSQL 8.1.17
• bash-3.00# pkginfo -i SUNWpostgr-devel
• system SUNWpostgr-devel PostgreSQL 8.1.17 development header files and libraries
• bash-3.00# pkginfo -i SUNWpostgr-docs
• system SUNWpostgr-docs Extra documentation for PostgreSQL 8.1.8
8. Prepare Database Environment (Postgres)
2. Create an OS user for Postgres and initialize the database:-
• #useradd -c 'PostgreSQL user' -d /export/home/postgres -m -s /bin/bash postgres
• #chown -R postgres:postgres /var/lib/pgsql
• #su - postgres
• bash-3.00$ initdb -D /var/lib/pgsql/data/mydata
• The files belonging to this database system will be owned by user "postgres".
• This user must also own the server process.
• The database cluster will be initialized with locale C.
• creating directory /var/lib/pgsql/data/mydata ... ok
• creating directory /var/lib/pgsql/data/mydata/global ... ok
• creating directory /var/lib/pgsql/data/mydata/pg_xlog ... ok
• creating directory /var/lib/pgsql/data/mydata/pg_xlog/archive_status ... ok
• creating directory /var/lib/pgsql/data/mydata/pg_clog ... ok
• creating directory /var/lib/pgsql/data/mydata/pg_subtrans ... ok
9. Prepare Database Environment (Postgres)
• Create an OS user for Postgres and initialize the database (continued):-
• creating directory /var/lib/pgsql/data/mydata/pg_twophase ... ok
• creating directory /var/lib/pgsql/data/mydata/pg_multixact/members ... ok
• creating directory /var/lib/pgsql/data/mydata/pg_multixact/offsets ... ok
• creating directory /var/lib/pgsql/data/mydata/base ... ok
• creating directory /var/lib/pgsql/data/mydata/base/1 ... ok
• creating directory /var/lib/pgsql/data/mydata/pg_tblspc ... ok
• selecting default max_connections ... 100
• selecting default shared_buffers ... 1000
• creating configuration files ... ok
• creating template1 database in /var/lib/pgsql/data/mydata/base/1 ... ok
• initializing pg_authid ... ok
• enabling unlimited row size for system tables ... ok
• initializing dependencies ... ok
• creating system views ... ok
• loading pg_description ... ok
• creating conversions ... ok
• setting privileges on built-in objects ... ok
10. Prepare Database Environment (Postgres)
• Create an OS user for Postgres and initialize the database (continued):-
• creating information schema ... ok
• vacuuming database template1 ... ok
• copying template1 to template0 ... ok
• copying template1 to postgres ... ok
• WARNING: enabling "trust" authentication for local connections
• You can change this by editing pg_hba.conf or using the -A option the
• next time you run initdb.
• Success. You can now start the database server using:
• postmaster -D /var/lib/pgsql/data/mydata
• or
• pg_ctl -D /var/lib/pgsql/data/mydata -l logfile start
12. Prepare Database Environment (Postgres)
5. Access the database:-
-bash-3.00$ psql postgres
Welcome to psql 8.1.17, the PostgreSQL interactive terminal.
Type: \copyright for distribution terms
\h for help with SQL commands
\? for help with psql commands
\g or terminate with semicolon to execute query
\q to quit
postgres=#
(Screenshots: Postgres command line UI and Postgres graphical UI)
13. Prepare Database Environment (Postgres)
6(a). Query the database:-
postgres=# select * from megaliths;
id | name | location | description | site_type_id | mapref
----+-------------+---------------+--------------------------+--------------+------------
1 | Stonehenge | Wiltshire | "Stone Circle and Henge" | 1 | SU 123 422
2 | Avebury | Wiltshire | "Stone Circle and Henge" | 2 | SU 103 700
3 | Sunhoney | Aberdeenshire | "Recumbent Stone Circle" | 2 | NJ 716 058
4 | Lundin | Links Fife | "Four Poster" | 3 | NO 404 027
5 | Callanish I | Western Isles | "Stone Circle and Rows" | 3 | NB 213 330
(5 rows)
(Screenshot: Postgres command line UI)
15. Prepare Database Environment (Postgres)
7. Shut down the Postgres database:-
bash-3.00$ pg_ctl -D /var/lib/pgsql/data/mydata stop
waiting for postmaster to shut down.... done
postmaster stopped
16. Database Handler
• Using the DBI to connect to Postgres and return the DB handle
#!/usr/bin/perl
use strict;
use warnings;
use DBI;
# db name is postgres, user is postgres and password is none
if (my $dbh = DBI->connect("dbi:Pg:dbname=postgres", "postgres", "")) {
print "Connected.\n";
} else {
print "Could not connect to DB: $DBI::errstr\n";
}
exit(0);
DBHandlerEx01.pl
Results:-
Connected.
Database Statement
#!/usr/bin/perl
use strict;
use warnings;
use DBI;
# Connect to the database
if (my $dbh = DBI->connect("dbi:Pg:dbname=postgres", "postgres", "")) {
    # Statement from DB handler
    my $sth = $dbh->prepare( "SELECT * FROM megaliths" )
        or die "Can't prepare SQL statement: $DBI::errstr\n";
    # Execute the statement in the database
    $sth->execute()
        or die "Can't execute SQL statement: $DBI::errstr\n";
    $sth->finish()
        or die "Error finishing the SQL statement: $DBI::errstr\n";
    # Disconnect database
    $dbh->disconnect()
        or warn "Error disconnecting: $DBI::errstr\n";
} else {
    print "Could not connect to DB: $DBI::errstr\n";
}
exit(0);
DBSthEx01.pl
Results:-
<no displayed result>
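When the full prepare/execute/finish cycle is not needed, the database handle also offers shortcut methods; selectall_arrayref runs a query and returns all rows in one call. A sketch assuming the same megaliths table:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect("dbi:Pg:dbname=postgres", "postgres", "", { RaiseError => 1 });

# One call: prepare, execute, and fetch every row as an array reference.
my $rows = $dbh->selectall_arrayref("SELECT id, name FROM megaliths");
foreach my $row (@$rows) {
    my ($id, $name) = @$row;
    print "id=$id, name=$name\n";
}
$dbh->disconnect();
```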
Database Manipulation
• Retrieve database records
#!/usr/bin/perl
use strict;
use warnings;
use DBI;
# 1. Connect to the database
if (my $dbh = DBI->connect("dbi:Pg:dbname=postgres", "postgres", "")) {
    my $sth = $dbh->prepare( "SELECT * FROM megaliths" )
        or die "Can't prepare SQL statement: $DBI::errstr\n";
    # 2. Execute the statement in the database
    $sth->execute() or die "Can't execute SQL statement: $DBI::errstr\n";
    # 3. Retrieve DB records
    my @row;
    while ( @row = $sth->fetchrow_array() ) {
        my ($id, $name, $location, $desc, $siteId, $mapRef) = @row;
        print "id=$id, name=$name, location=$location, mapRef=$mapRef\n";
    }
    # 4. Finish the statement
    $sth->finish() or die "Error finishing the SQL statement: $DBI::errstr\n";
    # 5. Disconnect database
    $dbh->disconnect()
        or warn "Error disconnecting: $DBI::errstr\n";
} else {
    print "Could not connect to DB: $DBI::errstr\n";
}
exit(0);
DBSelectEx01.pl
Results:-
id=1, name=Stonehenge, location=Wiltshire, mapRef=SU 123 422
id=2, name=Avebury, location=Wiltshire, mapRef=SU 103 700
id=3, name=Sunhoney, location=Aberdeenshire, mapRef=NJ 716 058
id=4, name=Lundin, location=Links Fife, mapRef=NO 404 027
id=5, name=Callanish I, location=Western Isles, mapRef=NB 213 330
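fetchrow_array returns columns by position; DBI also provides fetchrow_hashref, which returns each row keyed by column name, so the loop does not depend on column order. A sketch of the same retrieval:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect("dbi:Pg:dbname=postgres", "postgres", "", { RaiseError => 1 });
my $sth = $dbh->prepare("SELECT * FROM megaliths");
$sth->execute();

# Each row comes back as a hash reference keyed by column name.
while ( my $row = $sth->fetchrow_hashref() ) {
    print "id=$row->{id}, name=$row->{name}, mapRef=$row->{mapref}\n";
}
$sth->finish();
$dbh->disconnect();
```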
Database Manipulation
• Add new records
#!/usr/bin/perl
use strict;
use warnings;
use DBI;
my $newRecord = "6, Stone Big, The Area 51, The biggest stone, 3, CA 8759 993";
# Connect to the database
if (my $dbh = DBI->connect("dbi:Pg:dbname=postgres", "postgres", "")) {
    # Prepare the statement
    my $sth = $dbh->prepare( "INSERT INTO megaliths (id, name, location, description, site_type_id, mapref)
                              VALUES (?,?,?,?,?,?)" )
        or die "Can't prepare SQL statement: $DBI::errstr\n";
    # Extract the input into variables (split on comma plus optional spaces)
    my ($id, $name, $location, $description, $site_type_id, $mapref) = split(/,\s*/, $newRecord);
    # Execute the statement in the database
    $sth->execute($id, $name, $location, $description, $site_type_id, $mapref)
        or die "Can't execute SQL statement: $DBI::errstr\n";
    # Finish the statement
    $sth->finish() or die "Error finishing the SQL statement: $DBI::errstr\n";
    # Disconnect database
    $dbh->disconnect()
        or warn "Error disconnecting: $DBI::errstr\n";
} else {
    print "Could not connect to DB: $DBI::errstr\n";
}
}
exit(0);
DBAddEx01.pl
Results:-
<please see the db result: one new record inserted>
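One detail worth checking in the split step: splitting on /,/ alone leaves each field with its leading space attached, which would then be stored in the database; splitting on /,\s*/ trims it. A standalone sketch:

```perl
#!/usr/bin/perl
use strict;
use warnings;

my $newRecord = "6, Stone Big, The Area 51, The biggest stone, 3, CA 8759 993";

# Naive split: every field after the first keeps its leading space.
my @naive = split(/,/, $newRecord);
print "naive:   [$naive[1]]\n";      # prints "naive:   [ Stone Big]"

# Splitting on the comma plus any following whitespace trims the fields.
my @trimmed = split(/,\s*/, $newRecord);
print "trimmed: [$trimmed[1]]\n";    # prints "trimmed: [Stone Big]"
```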
Database Manipulation
• Update records in a table
#!/usr/bin/perl
use strict;
use warnings;
use DBI;
my $newRecord = "6, Stone Biggest, The Area 51, The biggest stone, 3, CA 8759 993";
# Connect to the database
if (my $dbh = DBI->connect("dbi:Pg:dbname=postgres", "postgres", "")) {
    # Prepare the statement
    my $sth = $dbh->prepare( "UPDATE megaliths SET name=? WHERE (id=?)" )
        or die "Can't prepare SQL statement: $DBI::errstr\n";
    # Extract the input into variables
    my ($id, $name, @others) = split(/,\s*/, $newRecord);
    # Execute the statement in the database
    $sth->execute($name, $id)
        or die "Can't execute SQL statement: $DBI::errstr\n";
    # Finish the statement
    $sth->finish() or die "Error finishing the SQL statement: $DBI::errstr\n";
    # Disconnect database
    $dbh->disconnect()
        or warn "Error disconnecting: $DBI::errstr\n";
} else {
    print "Could not connect to DB: $DBI::errstr\n";
}
}
exit(0);
DBUpdateEx01.pl
Results:-
<please see the db result: the record name is updated>
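For UPDATE and DELETE, DBI's execute() returns the number of rows affected (the string "0E0" for zero, which still evaluates as true), so a script can report what actually changed. A sketch against the same table:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect("dbi:Pg:dbname=postgres", "postgres", "", { RaiseError => 1 });
my $sth = $dbh->prepare("UPDATE megaliths SET name=? WHERE (id=?)");

# execute() returns the affected-row count for non-SELECT statements.
my $count = $sth->execute("Stone Biggest", 6);
print "Updated $count row(s)\n";

$sth->finish();
$dbh->disconnect();
```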
Database Manipulation
• Delete records from a table
#!/usr/bin/perl
use strict;
use warnings;
use DBI;
my $newRecord = "6, Stone Biggest, The Area 51, The biggest stone, 3, CA 8759 993";
# Connect to the database
if (my $dbh = DBI->connect("dbi:Pg:dbname=postgres", "postgres", "")) {
    # Prepare the statement
    my $sth = $dbh->prepare( "DELETE FROM megaliths WHERE (id=?)" )
        or die "Can't prepare SQL statement: $DBI::errstr\n";
    # Extract the input into variables
    my ($id, @others) = split(/,\s*/, $newRecord);
    # Execute the statement in the database
    $sth->execute($id)
        or die "Can't execute SQL statement: $DBI::errstr\n";
    # Finish the statement
    $sth->finish() or die "Error finishing the SQL statement: $DBI::errstr\n";
    # Disconnect database
    $dbh->disconnect()
        or warn "Error disconnecting: $DBI::errstr\n";
} else {
    print "Could not connect to DB: $DBI::errstr\n";
}
}
exit(0);
DBDeleteEx01.pl
Results:-
<please see the db result: the record is deleted>
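All the manipulation examples above run with AutoCommit on, so each statement is committed immediately. Turning AutoCommit off groups statements into a transaction that either commits as a unit or rolls back on error; a sketch using the same connection details:

```perl
#!/usr/bin/perl
use strict;
use warnings;
use DBI;

my $dbh = DBI->connect("dbi:Pg:dbname=postgres", "postgres", "",
                       { RaiseError => 1, AutoCommit => 0 });

eval {
    # Both statements succeed together, or neither takes effect.
    $dbh->do("DELETE FROM megaliths WHERE (id=?)", undef, 6);
    $dbh->do("INSERT INTO megaliths (id, name, location, description, site_type_id, mapref)
              VALUES (?,?,?,?,?,?)",
             undef, 6, "Stone Big", "The Area 51", "The biggest stone", 3, "CA 8759 993");
    $dbh->commit();
};
if ($@) {
    warn "Transaction aborted: $@";
    $dbh->rollback();
}
$dbh->disconnect();
```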
Line ID: Danairat
FB: Danairat Thanabodithammachari
+668-1559-1446
Thank you