DART - improving the science. Dublin 23022012

A presentation given by Anthony Beck to the Department of Archaeology at University College Dublin on the 21st February 2012

  1. DART - Improving the science underpinning archaeological detection
     Anthony (Ant) Beck
     Twitter: AntArch
     Dublin – 23rd February 2012
     School of Computing, Faculty of Engineering
  2. Overview
     • How do we detect stuff
     • Why DART
     • Going back to first principles
     • DART overview
     • Data so far
     • Open Science
  3. Overview
     There is no need to take notes:
     • Slides –
     • Text – http://dl.dropbox.com/u/393477/MindMaps/Events/ConferencesAndWorkshops.html
     There is every need to ask questions.
     The slides and text are released under a Creative Commons by-attribution licence.
  4. Archaeological Prospection
     What is the basis for detection
  5. Archaeological Prospection
     What is the basis for detection
     At the small scale:
     • The archaeological record can be considered as a more or less continuous spatial distribution of artefacts, structures, organic remains, chemical residues, topographic variations and other less obvious modifications
  6. Archaeological Prospection
     What is the basis for detection
     At the large scale:
     • The distribution is far from even, with large areas where archaeological remains are widely and infrequently dispersed. There are other areas, however, where materials and other remains are abundant and clustered. It is these peaks of abundance that are commonly referred to as sites, features, anomalies (whatever!).
  7. Archaeological Prospection
     What is the basis for detection
     Discovery requires the detection of one or more site constituents.
     The important points for archaeological detection are:
  8. Archaeological Prospection
     Archaeological sites are physical and chemical phenomena
  9. Archaeological Prospection
     There are different kinds of site constituents
  10. Archaeological Prospection
      The abundance and spatial distribution of different constituents vary both between and within individual sites
  11. Archaeological Prospection
      These attributes may be masked or accentuated by a variety of other phenomena
  12. Archaeological Prospection
      Importantly, from a remote sensing perspective, archaeological sites do not exhibit consistent spectral signatures
  13. Archaeological Prospection
      What is the basis for detection
      We detect Contrast:
      • between the expression of the remains and the local background value
      Direct Contrast:
      • where a measurement, which exhibits a detectable contrast with its surroundings, is taken directly from an archaeological residue
      Proxy Contrast:
      • where a measurement, which exhibits a detectable contrast with its surroundings, is taken indirectly from an archaeological residue (for example from a crop mark)
  14. Archaeological Prospection
      What is the basis for detection
  15. Archaeological Prospection
      What is the basis for detection
  16. Archaeological Prospection
      Summary
      The sensor must have:
      • The spatial resolution to resolve the feature
      • The spectral resolution to resolve the contrast
      • The radiometric resolution to identify the change
      • The temporal sensitivity to record the feature when the contrast is exhibited
      The image must be captured at the right time:
      • Different features exhibit contrast characteristics at different times
  17. Archaeological Prospection
      What is the basis for detection
  18. Archaeological Prospection
      What is the basis for detection
      Micro-topographic variations
      Soil marks
      • variation in mineralogy and moisture properties
      Differential crop marks
      • constraint on root depth and moisture availability changing crop stress/vigour
      Proxy thaw marks
      • exploitation of the different thermal capacities of objects, expressed in the visual component as thaw marks
      Now you see me... now you don't
  19. Why DART?
      Isn't everything rosy in the garden?
  20. Why DART?
      'Things' are not well understood:
      • Environmental processes
      • Sensor responses (particularly new sensors)
      • Constraining factors (soil, crops etc.)
      • Bias and spatial variability
      Techniques are scaling!
      • Geophysics!
      IMPACTS ON
      • Deployment
      • Management
  21. Why DART?
      Precision agriculture
  22. Why DART?
      Precision agriculture
  23. Why DART?
      Traditional AP exemplar
  24. Why DART?
      Traditional AP exemplar
      Significant bias in its application:
      • in the environmental areas where it is productive (for example, clay environments tend not to be responsive)
      • Surveys don't tend to be systematic
      • Interpretation tends to be more art than science
  25. What do we do about this?
      Go back to first principles:
      • Understand the phenomena
      • Understand the sensor characteristics
      • Understand the relationship between the sensor and the phenomena
      • Understand the processes better
      • Understand when to apply techniques
  26. What do we do about this? Understand the phenomena
      How does the object generate an observable contrast to its local matrix?
      • Physical
      • Chemical
      • Biological
      • etc.
      Are the contrasts permanent or transitory?
  27. What do we do about this? Understand the phenomena
      If transitory, why are they occurring? Is it changes in:
      • Soil type
      • Land management
      • Soil moisture
      • Temperature
      • Nutrient availability
      • Crop type
      • Crop growth stage
  28. What do we do about this? Understand the relationship between the sensor and the phenomena
  29. What do we do about this? Understand the relationship between the sensor and the phenomena
      Spatial Resolution
  30. What do we do about this? Understand the relationship between the sensor and the phenomena
      Radiometric Resolution
      Radiometric resolution determines how finely a system can represent or distinguish differences of intensity (see the sketch below).
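A minimal illustrative sketch of the idea behind radiometric resolution, not taken from the DART materials: an N-bit sensor records 2^N intensity levels, so the smallest distinguishable step across a given intensity range shrinks as the bit depth grows. The bit depths used below are hypothetical examples.

```python
# Illustrative sketch only: how bit depth relates to the number of
# recordable intensity levels and the smallest distinguishable step.

def intensity_levels(bit_depth: int) -> int:
    """Number of distinct digital numbers an N-bit sensor can record."""
    return 2 ** bit_depth

def smallest_step(bit_depth: int, intensity_range: float = 1.0) -> float:
    """Smallest change in intensity that maps to a different digital number,
    assuming the range is quantised linearly."""
    return intensity_range / (intensity_levels(bit_depth) - 1)

for bits in (8, 12, 16):  # hypothetical bit depths
    print(f"{bits}-bit: {intensity_levels(bits)} levels, step {smallest_step(bits):.6f}")
```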
  31. What do we do about this? Understand the relationship between the sensor and the phenomena
      Temporal Resolution
  32. What do we do about this? Understand the relationship between the sensor and the phenomena
      Spectral(?) Resolution
  33. What do we do about this? Example from multi- or hyperspectral imaging
  34. DART
  35. DART - Collaborators
  36. DART: Ground Observation Benchmarking
      Based upon an understanding of:
      • Nature of the archaeological residues
        • Nature of archaeological material (physical and chemical structure)
        • Nature of the surrounding material with which it contrasts
        • How proxy material (crop) interacts with archaeology and the surrounding matrix
      • Sensor characteristics
        • Spatial, spectral, radiometric and temporal
        • How these can be applied to detect contrasts
      • Environmental characteristics
        • Complex natural and cultural variables that can change rapidly over time
  37. DART: Ground Observation Benchmarking
      Try to understand the periodicity of change
      • Requires
        • intensive ground observation
        • at known sites (and their surroundings)
        • in different environmental settings
        • under different environmental conditions
  38. DART: Sites
      Location
      • Diddington, Cambridgeshire
      • Harnhill, Gloucestershire
      Both with
      • contrasting clay and well-draining soils
      • an identifiable archaeological repertoire
      • under arable cultivation
      Contrasting macro-environmental characteristics
  39. DART: Field Measurements
      Spectro-radiometry
      • Soil
      • Vegetation
        • Every 2 weeks
      Crop phenology
      • Height
      • Growth (tillering)
      Flash res 64
      • Including induced events
  40. DART: Field Measurements
      Resistivity
      Ground penetrating radar
      Embedded soil moisture and temperature probes
      • Logging every hour
      Weather station
      • Logging every half hour
  41. DART: Field Measurements
      Aerial data
      • Hyperspectral surveys
        • CASI
        • EAGLE
        • HAWK
      • LiDAR
      • Traditional aerial photographs
      Low-level photography
      • 1 photo every 30 minutes
  42. DART: Probe Arrays
  43. DART: Probe Arrays
  44. DART: Weather Station
      Davis Vantage Pro weather station
      • Collects a range of technical data, e.g.
        • Wind speed
        • Wind direction
        • Rainfall
        • Temperature
        • Humidity
        • Solar radiation
        • Barometric pressure
      • And derivatives (one example is sketched below)
        • Wind chill
        • Heat index
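As an illustration of what such a derived quantity is, here is a minimal sketch of wind chill computed from air temperature and wind speed using the standard JAG/TI formula in its metric form. This is purely illustrative; the station reports its own derived values, and this is not DART code.

```python
# Illustrative sketch: wind chill from temperature and wind speed using the
# standard JAG/TI formula (metric form). Not the station's own firmware.

def wind_chill_c(temp_c: float, wind_kmh: float) -> float:
    """Wind chill (deg C); the formula applies for temp <= 10 C and wind > 4.8 km/h."""
    if temp_c > 10 or wind_kmh <= 4.8:
        return temp_c  # outside the formula's range of validity
    v = wind_kmh ** 0.16
    return 13.12 + 0.6215 * temp_c - 11.37 * v + 0.3965 * temp_c * v

print(round(wind_chill_c(2.0, 20.0), 1))  # 2 C with a 20 km/h wind feels like roughly -3 C
```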
  45. DART
      (Figure labels: ERT, Ditch, Rob Fry B'ham TDR, Imco TDR, Spectro-radiometry transect)
  46. DART
      (Figure labels: ERT, Ditch, Rob Fry B'ham TDR, Imco TDR, Spectro-radiometry transect)
  47. DART: Laboratory Measurements
      Geotechnical analyses
      Geochemical analyses
      Plant biology
  48. DART: Data so far - Temperature
  49. DART: Data so far - Temperature
  50. DART: Data so far - Temperature
  51. DART: Data so far - Temperature
  52. DART: Data so far - Temperature
  53. DART: Data so far - Temperature
      Useful tool for
      • Scheduling diurnal thermal inertia flights
      • Calibrating the TDR readings
  54. DART: Data so far - Permittivity
      TDR - how does it work? (A sketch of the calculation follows below.)
      • Sends a pulse of EM energy
      • Due to changes in impedance at the start and at the end of the probe, the pulse is reflected back and the reflections can be identified on the waveform trace
      • The distance between these two reflection points is used to determine the dielectric permittivity
      • Different soils have different dielectric permittivity
        • This needs calibrating before soil moisture can be derived from the sensors
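The slide describes reading reflections off the waveform; the standard relationship behind this converts the two-way travel time between the two reflections into an apparent dielectric permittivity. The sketch below is illustrative and assumes the probe length and travel time are already known; it is not the project's processing code.

```python
# Minimal sketch (not the project's processing code): converting a TDR
# two-way travel time into an apparent dielectric permittivity. Assumes the
# probe length and the travel time between the start- and end-of-probe
# reflections have already been read off the waveform.

C = 299_792_458.0  # speed of light in a vacuum (m/s)

def apparent_permittivity(travel_time_s: float, probe_length_m: float) -> float:
    """Apparent dielectric permittivity Ka from the two-way travel time."""
    # The pulse travels down and back along the probe (2 * L) at c / sqrt(Ka),
    # so Ka = (c * t / (2 * L)) ** 2.
    return (C * travel_time_s / (2.0 * probe_length_m)) ** 2

# Hypothetical example: a 0.15 m probe and a 5 ns two-way travel time
print(round(apparent_permittivity(5e-9, 0.15), 1))  # ~25, typical of a moist soil
```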
  55. DART: Data so far - Permittivity
      Key aims
      • Investigate the propagation of EM radiation in different soil conditions (e.g. temperature, magnetic permeability, moisture content, density) and identify the difference between an archaeological residue and the surrounding soil matrix
      • Attempt to use geotechnical properties (e.g. particle size distribution, moisture content) to predict the geophysical responses of the different EM sensors used in aerial and geophysical work
      • Link the soil properties to local weather and other environmental factors to enable better planning of collection techniques
  56. DART: Data so far - Permittivity
  57. DART: Data so far - Permittivity
  58. DART: Data so far - Permittivity
  59. DART: Data so far - Permittivity
      Further analysis of permittivity and conductivity against rainfall
      Linking the changes to the weather patterns
      Comparisons can be made between
      • Soils at different depths
      • Archaeological and non-archaeological features
      • Different soil types at the different locations
      Conversion to moisture content is also a priority (a sketch of one common calibration follows below)
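For the moisture conversion mentioned above, a widely used starting point is the empirical Topp et al. (1980) relationship between apparent permittivity and volumetric water content. The sketch below uses that generic calibration; the project's soil-specific calibrations may well differ.

```python
# Sketch of a generic empirical calibration (Topp et al. 1980) from apparent
# dielectric permittivity Ka to volumetric water content. Illustrative only.

def topp_moisture(ka: float) -> float:
    """Volumetric water content (m^3/m^3) from apparent permittivity Ka."""
    return -5.3e-2 + 2.92e-2 * ka - 5.5e-4 * ka**2 + 4.3e-6 * ka**3

# Example: Ka of about 25 corresponds to roughly 0.40 m^3/m^3 (a wet soil)
print(round(topp_moisture(25.0), 2))
```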
  60. DART: Data so far – Earth Resistance
  61. DART: Data so far – Earth Resistance
      Methodology similar to that employed by Parkyn et al. (2011)
      Overview
      • data points
        • lie within the ditch feature
        • over the non-archaeological feature
      • find an average data value for the feature and the surrounding soil
      The percentage difference between these two figures can then be considered the amount of contrast within the test area.
      The higher the percentage, the better the feature can be defined. (A sketch of this calculation follows below.)
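A minimal sketch of the contrast-factor calculation as described above. This is my reading of the method, with the percentage difference taken relative to the background mean; the exact normalisation used in the project may differ, and the readings below are hypothetical, not DART data.

```python
# Sketch of the contrast-factor idea described above. Assumption: the
# percentage difference is taken relative to the background (surrounding
# soil) mean; the readings below are hypothetical, not DART data.

from statistics import mean

def contrast_factor(feature_readings, background_readings) -> float:
    """Percentage difference between the mean feature value and the mean
    background value; higher means the feature is easier to define."""
    feature_mean = mean(feature_readings)
    background_mean = mean(background_readings)
    return abs(feature_mean - background_mean) / background_mean * 100.0

ditch = [52.1, 55.3, 53.8]        # hypothetical earth-resistance readings over the ditch
background = [44.7, 45.2, 46.1]   # hypothetical readings over the surrounding soil
print(round(contrast_factor(ditch, background), 1))  # ~18.5 %
```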
  62. DART: Data so far – Earth Resistance
      Contrast factor (%) by twin-probe electrode separation:

      Month      | 0.25 m      | 0.5 m    | 0.75 m    | 1 m
      June       | 18.04742552 | 18.88545 | 18.896896 | 16.79403
      July       | 19.13517794 | 17.15205 | 17.081613 | 15.01906
      August     | N/A         | N/A      | N/A       | N/A
      September  | 8.841189868 | 13.255   | 14.512463 | 15.53069
      October    | 7.988128839 | 10.97714 | 12.217018 | 11.6229

      (Chart: "Change of Contrast Factors with Seasons", Contrast Factor (%) per month, one series per Twin Probe Electrode Separation (m); annotated "Difference in magnitude".)
  63. Spectro-radiometry: Methodology
      • Recorded monthly
        • Twice monthly at Diddington during the growing season
      • Transects across linear features
      • Taken in the field where weather conditions permit
      • Surface coverage evaluated using near-vertical photography
      • Vegetation properties recorded along the transect
        • Chlorophyll (SPAD)
        • Height
  64. Spectro-radiometry: Methodology
      • Lab-based, background methodology
        • Soils
          • Soil samples taken along the transect
          • Reflectance measured with varying moisture content
        • Vegetation
          • Vegetation samples taken during each field visit
          • Measured under artificial light under controlled conditions
  65. Diddington transect 1: Spectroradiometry, June 2011
      (Chart: relative reflectance against wavelength, 400-700 nm, for 08/06/2011, 14/06/2011 and 27/06/2011, measured on and outside the archaeology.)
  66. Diddington transect 1: Spectroradiometry, June 2011
      (Chart: relative reflectance against wavelength, 350-2450 nm, for 08/06/2011, 14/06/2011 and 27/06/2011, measured on and outside the archaeology.)
  67. DART: Plant Biology
      Lab experiments conducted in collaboration with Leeds Plant Biology in 2011 and repeated in 2012
      From soils at Quarry Field
      Soil structure appears to be the major component influencing root penetration and plant health
  68. DART: Knowledge Base
  69. DART: Communication
  70. The case for Open Science, from Cameron Neylon
  71. Open Data: Server (in the near future)
      The full project archive will be available from the server
      • Raw data
      • Processed data
      • Web services
      Will also include
      • TDR data
      • Weather data
      • Subsurface temperature data
      • Soil analyses
      • Spectro-radiometry transects
      • Crop analyses
      • Excavation data
      • In-situ photos
  72. Open Data: Server (in the near future)
      Also
      • Hyperspectral data
      • Thermal imaging
      • Full-waveform LiDAR
      • UAV data collection
      Formats
      • Standard interoperable formats
      Licences
      • These are not complete
      • Most data will be made available under an open re-use licence (see server)
      • Creative Commons
      • GPL
  73. Why are we doing this – spreading the love
  74. Why are we doing this – it's the right thing to do
      DART is a publicly funded project
      Publicly funded data should provide benefit to the public
  75. Why are we doing this – IMPACT/unlocking potential
      The more people use the data, the greater the impact
      Better financial and intellectual return for the investors
  76. Why are we doing this – innovation
      Reducing barriers to data and knowledge can improve innovation
  77. Why are we doing this – education
      To provide baseline exemplar data for teaching and learning
  78. Why are we doing this – building our network
      Find new ways to exploit our data
      Develop contacts
      Write more grant applications
  79. Questions
