Space Time Visualization Using Python 
[Poster map: the GPS route into the City of Niagara Falls, with labeled roads (I-190, I-90, State Hwy 31, State Hwy 324, Delaware St, Sheridan Dr, Main St, Shawnee Rd, River Rd, and others) and a legend listing Primary Roads, Destinations, GPS Track, Water, Erie County Block Groups, and a scale bar in miles. The Python script printed on the poster is reproduced in full in the "Python Script" section at the end of this document.]
Seyed Saeid Saadatmand
Independent Study, Summer 2014
Prof. Li Yin

Space Time Visualization Using Python
I conducted a pilot GPS study for my Master's project. For the pilot study, I asked a student to travel to the city of Niagara Falls with a GPS device in hand and visit the city. The "Space Time Visualization Using Python" project visualizes the GPS data collected by that student. For this project, a tool for ArcToolbox was designed using Python scripting in the PythonWin environment. The tool can visualize any GPS data in a space-time environment. The picture below shows the visualization result, a product of this tool showing the route in a 3D environment: the X and Y axes are geographic coordinates, and Z is the time value.
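Concretely, the Z (time) value the tool writes into its TimeSec field is the time of day expressed in seconds (hour * 3600 + minute * 60 + second), so a fix recorded at 14:05:30 plots at 50,730 seconds. A minimal, standalone sketch of that conversion in plain Python, independent of the arcpy tool, is:

from datetime import datetime

def time_to_z(timestamp, fmt="%Y-%m-%d %H:%M:%S"):
    """Convert a GPS timestamp string to seconds since midnight (the Z-axis value)."""
    t = datetime.strptime(timestamp, fmt).time()
    return t.hour * 3600 + t.minute * 60 + t.second

print(time_to_z("2014-06-21 14:05:30"))  # -> 50730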
The map below shows the route that was used to get to the City of Niagara Falls:

These tables show the timetable of the trip:
As can be seen from the first table, two major stops were recorded, at City Hall and at the Niagara Center. In the visualization graph these two destinations appear as two horizontal lines. The GPS visualization tool can be used not only for this dataset but also for any type of GPS data sample. See the tool instructions on the next page:
Instructions:

The window below shows the tool environment. To visualize GPS data, follow these steps.

Create the 3D line feature class:
- Define a geoprocessing environment: you need to define a path and select an empty folder. ArcMap will create four intermediate shapefiles in this folder in order to create the final shapefile.
- Select an input GPS point feature class: you need to select a GPS point feature class that includes a time field.
- Type the name of the "time field": check the attribute table of the GPS point feature class and copy the name of the time field.
- Define a path for your output shapefile and type a name for the final line feature class: it does not matter whether you use the same path as the geoprocessing environment or any other path.

Use ArcScene for visualization:
- Copy the 3D line created in the previous step into ArcScene.
- Change the background color.
- Change the symbology of the line.
- Calculate a vertical exaggeration if you are not able to recognize the height difference (a sketch for estimating it follows this list).
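The exaggeration value can be estimated rather than found by trial and error: scale the time (Z) range so it is of the same order as the horizontal extent of the route. The sketch below is an assumption-based example and not part of the delivered tool; the shapefile path is hypothetical, the "min" field name follows the script later in this document, and arcpy.da cursors require ArcGIS 10.1 or later.

import arcpy

line_3d = r"C:\GPS\Results\3D_line.shp"   # hypothetical path to the 3D line output

# horizontal extent of the route
ext = arcpy.Describe(line_3d).extent
xy_range = max(ext.XMax - ext.XMin, ext.YMax - ext.YMin)

# "min" is the per-segment start time (in seconds) carried over from the spatial join
times = [row[0] for row in arcpy.da.SearchCursor(line_3d, ["min"])]
z_range = (max(times) - min(times)) or 1

# an exaggeration of roughly xy_range / z_range makes the time axis about as tall
# as the route is wide; enter the value in the ArcScene scene properties
print("Suggested vertical exaggeration: {:.3f}".format(xy_range / z_range))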
Instructions for using the attached data:

In the zip folder you will find these main folders and files:
- Draft_Scripts_and_Tools: all of the earlier scripts and tools are saved in this folder. You do not need to use it; it is included only to show the development process.
- GPS_Sample: two GPS samples are included in this folder. The two datasets have different time fields, and you can use both to test the tool; the tool is designed to work with any GPS time field.
- GPS_Visualiztion.tbx: the tool is found under this toolbox.
- Results: the 3D lines created from the two data samples. You can add these two shapefiles one by one to the ArcScene environment and see the 3D view.
- Geo_ENV: you can use this empty folder as your geoprocessing environment while testing the tool.
- 3DViews.sxd: this ArcScene document can be opened to see the final results in 3D, showing the GPS visualization of the data collected for my final project.
- Test.py: the script, written using PythonWin.

To test the tool:
- Open ArcMap.
- Navigate to the location where you saved GPS.zip.
- Unzip the file.
- Open the GPS visualization tool.
- Navigate to the GPS folder and select GEO_ENV as your geoprocessing environment.
- Navigate to GPS_Sample and select "five.shp" as the input data.
- Type "DateTimeS" in the date field.
- Define a path and a name for the output (3D).
- Open ArcScene and add "3D" to ArcScene.
- Change the exaggeration until you see a 3D line.
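The same test can also be scripted instead of run through the tool dialog. The sketch below is only an assumption-based example: the tool name GPSVisualization and the toolbox alias gps are placeholders (check the toolbox and tool properties for the real names), and the paths assume the zip was extracted to C:\GPS.

import arcpy

# load the custom toolbox from the unzipped GPS folder (alias "gps" is a placeholder)
arcpy.ImportToolbox(r"C:\GPS\GPS_Visualiztion.tbx", "gps")

# the four parameters mirror the tool dialog: geoprocessing environment,
# input GPS points, time field name, and output 3D line feature class
arcpy.GPSVisualization_gps(r"C:\GPS\Geo_ENV",
                           r"C:\GPS\GPS_Sample\five.shp",
                           "DateTimeS",
                           r"C:\GPS\Results\3D_line.shp")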
Python Script:
# arcpy
import arcpy

# environment and parameters for the tool
arcpy.env.workspace = arcpy.GetParameterAsText(0)
GPSinFeature = arcpy.GetParameterAsText(1)
DateTimeS = arcpy.GetParameterAsText(2)
OutOut4 = arcpy.GetParameterAsText(3)

# set local variables
FieldName1 = "hour"
FieldName2 = "minute"
FieldName3 = "second"
FieldName4 = "TimeXAll"
FieldName5 = "TimeXMin"
FieldName7 = "TimeSec"
fieldPrecision = 9

# add three new fields to the GPS shapefile
arcpy.AddField_management(GPSinFeature, FieldName1, "LONG", fieldPrecision)
arcpy.AddField_management(GPSinFeature, FieldName2, "LONG", fieldPrecision)
arcpy.AddField_management(GPSinFeature, FieldName3, "LONG", fieldPrecision)

# calculate time
# create a reference field which holds just the time, not the date
arcpy.AddField_management(GPSinFeature, FieldName4, "TEXT", fieldPrecision)

# New: convert any type of date field to one standard date field
arcpy.ConvertTimeField_management(GPSinFeature, DateTimeS, "yyyy-MM-dd HH:mm:ss;1033;;",
                                  "HHMMSS", "TEXT", "yyyyMMddHHmmss")

# New: copy the HHmmss part of the standardized field (HHMMSS) into the field just created (TimeXAll)
arcpy.CalculateField_management(GPSinFeature, "TimeXAll", "Mid( [HHMMSS], 9, 14)", "VB", "#")

# add a field to hold the source for minute and second
arcpy.AddField_management(GPSinFeature, FieldName5, "TEXT", fieldPrecision)

# New: calculate the source field for minute and second
arcpy.CalculateField_management(GPSinFeature, "TimeXMin", "Right( [TimeXAll], 4)", "VB", "#")

# calculate hour
arcpy.CalculateField_management(GPSinFeature, "hour", "Left( [TimeXAll], 2)", "VB", "#")

# calculate minute
arcpy.CalculateField_management(GPSinFeature, "minute", "Left( [TimeXMin], 2)", "VB", "#")

# calculate second
arcpy.CalculateField_management(GPSinFeature, "second", "Right( [TimeXMin], 2)", "VB", "#")

# create a field for the time expressed in seconds
arcpy.AddField_management(GPSinFeature, FieldName7, "LONG", fieldPrecision)

# combine hour + minute + second into a single value in seconds
arcpy.CalculateField_management(GPSinFeature, "TimeSec",
                                "[hour] * 3600 + [minute] * 60 + [second]", "VB", "#")

# points to lines
# parameters
outFeature = "Line_ready.shp"
outFeature2 = "splited"
sortField = "TimeSec"
radius = "2 Meters"

# create a line from the points, ordered by the sort field
arcpy.PointsToLine_management(GPSinFeature, outFeature, "", sortField)

# split the line at the GPS points using the search radius
arcpy.SplitLineAtPoint_management(outFeature, GPSinFeature, outFeature2, radius)

# extract two numbers (minimum and maximum time) per segment and store them in two fields;
# to do that, copy the GPS points and use the copy in the spatial join
# set local variables
outFeatureClass = "ForSpatial"

# execute FeatureClassToFeatureClass
arcpy.FeatureClassToFeatureClass_conversion(GPSinFeature, arcpy.env.workspace, outFeatureClass)

# parameters for the spatial join
OutPut3 = "MinMaxSec"
inFeature8 = "splited.shp"

# the field mapping below creates "min" and "max" fields holding the minimum and
# maximum TimeSec of the points that intersect each line segment
arcpy.SpatialJoin_analysis(inFeature8, GPSinFeature, OutPut3, "JOIN_ONE_TO_ONE", "KEEP_ALL",
                           "max max true false false 50 Double 0 0 ,Max,#,ForSpatial.shp,TimeSec,-1,-1;"
                           "min min true false false 50 Double 0 0 ,Min,#,ForSpatial.shp,TimeSec,-1,-1",
                           "INTERSECT", "#", "#")

# feature to 3D: extrude each segment between its min and max time values
arcpy.CheckOutExtension("3D")
arcpy.FeatureTo3DByAttribute_3d("MinMaxSec.shp", OutOut4, "min", "max")
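For readers who prefer a single pass over the attribute table, the hour/minute/second extraction and the TimeSec calculation above could also be collapsed into one update cursor. This is only an alternative sketch under stated assumptions, not the delivered tool's method: the path is hypothetical, HHMMSS is the field created by ConvertTimeField above, and arcpy.da cursors require ArcGIS 10.1 or later.

import arcpy

gps_fc = r"C:\GPS\GPS_Sample\five.shp"   # hypothetical path to the GPS point shapefile

arcpy.AddField_management(gps_fc, "TimeSec", "LONG")

# HHMMSS holds the standardized yyyyMMddHHmmss string; its last six characters are HHmmss
with arcpy.da.UpdateCursor(gps_fc, ["HHMMSS", "TimeSec"]) as cursor:
    for stamp, _ in cursor:
        hh, mm, ss = int(stamp[8:10]), int(stamp[10:12]), int(stamp[12:14])
        cursor.updateRow([stamp, hh * 3600 + mm * 60 + ss])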
