Dr. Armando Guevara argues that traditional monolithic geoimaging sensors are becoming outdated in much the same way that mainframe computers were. Next-generation multi-purpose sensors are more flexible, scalable and cost-effective. Built from commercial off-the-shelf components, they are more affordable to purchase and maintain than traditional single-purpose sensors. Collection companies will increasingly adopt multi-purpose sensors because they can adapt to changing needs and jobs in ways inflexible monolithic sensors cannot. The transition mirrors how mainframes gave way to more powerful and accessible desktop technologies.
“Going the Way of the Mainframe: A New Generation of Geoimaging Sensors Is Poised to Change the Geoimaging Industry”
By Dr. Armando Guevara
When I say the word “mainframe”, what comes to mind? If you are a contemporary of my college days, you
probably envision a hulking computing system that filled an entire room, serving as the beating heart of
IT operations for organizations back in the day before “IT” was even a term. If you are younger than that,
you might even ask, “What is a mainframe?” which in many ways is a perfect illustration of one of the
main themes of this article.
When I think back to those hulking machines and say the word “mainframe”, I do so with some fondness
in my voice because I’ve been around long enough to remember how they represented a huge leap
forward for the time. They also remind me of my free-spirited college days, which definitely adds to the
nostalgia. Mainframe computers were significant because they democratized computing by
allowing companies of any size to take advantage of the computing power that was previously available
to only a very few of the most deep-pocketed commercial businesses and government agencies. They
allowed any organization to become part of the computer age, assuming you had a basement big enough
and an electricity budget large enough to support it.
Mainframes were powerful for their time, but no one from that era would argue with the reality that they
were expensive to buy, expensive to maintain and fix, bulky (to the nth degree), inflexible and resistant to
the emerging computing needs of organizations over time. They had all of those shortcomings, but
nonetheless those of us who started our careers in a different era have strong feelings of fondness for
them. They had a long run of technological supremacy and then they had to face what I call the 3-key
transforming paradigm shift vectors/forces of our time (“the 3 forces”): 1) data and all devices became
digital, 2) devices became increasingly smaller, and 3) devices became increasingly faster. Those three
forces converged so rapidly that within 10 years “mainframe power” was put in the palm of everyone’s
hands (along with a lot of other functionality, including the ability to make calls, take pictures, capture
video, listen to the radio, watch TV, get GPS coordinates and more—all spatially-enabled in a single
device). So with few exceptions, mainframe computers yielded to the 3 forces and made way for more
powerful, less expensive, more flexible technology descendants: mini computers, work stations, then PCs
and now handhelds [1].
How does that relate to sensor technology? Sensors are undergoing a transformation that is remarkably
similar to what happened to mainframes. Monolithic, single-purpose sensors are in many ways the
mainframes of the geospatial geoimaging industry. They have been around forever (well, 10 years these
days is “forever”) and many of us have become attached to them, just like we did with the mainframes we
worked with back at the beginning of our careers. But a new generation of sensors is emerging and poised
to replace those single-purpose sensors just like the work station and PC did to the mainframe years and
years ago.
For far too long, in my mind, manufacturers have focused on selling single-purpose, monolithic,
mission-specific sensors (from EO medium <17 kps to large >17 kps frame formats). Those sensors were
very good at what they were designed to do. Clearly that is true, otherwise there is no way they would
have lasted over a decade as they have. But those strengths came with weaknesses that geospatial
collection companies have had to patiently cope with in order to do their day-to-day jobs: lack of
flexibility, limited scalability, non-standards-based architecture, interoperability challenges, high up-front
[1] Google: Guevara 1994, The Spatial Enabling of Information
costs, expensive maintenance and other downsides. Yes, there is a nostalgic beauty to single-purpose sensors that are big hunks of elegantly designed metal built to do a specific job. I have an
undeniable admiration for them and their pioneering manufacturers, but I feel strongly that the
technological advancements, higher performance and dramatically lower cost of the next generation of
sensors will do to monolithic, traditional sensors, what PCs did to the mainframe.
Instead of using a single-purpose, monolithic design, these next-generation sensors are designed to be
multi-purpose, functionally flexible and far more cost-effective. They are smaller, faster and easier to work with (naturally aligned with the 3 forces). They are also more adaptable to changing job
requirements, which is increasingly important for collection companies that often have dramatically
different jobs during a single day using the same aircraft. Cost is also another big difference between the
traditional sensors and the new generation of solutions. Traditional sensors have high up-front costs and
are expensive to maintain because they do not have standards-based architectures and are difficult to
service in the field. In contrast, next-generation sensors are being built using standards and COTS
components that give them lower up-front costs and simpler, less expensive maintenance requirements
after they are put to work.
When I talk to peers about multi-purpose sensors, the topic of performance always comes up. Many folks
make the assumption that single-purpose sensors must have better accuracy or performance since they
only focus on one thing. They also assume that because multi-purpose sensors are economical, they must offer lower performance or
be of inferior quality. As my teenage son would say, the response to all of those assumptions is: “Not!”
The truth is that multi-purpose sensors are at least as good as traditional sensors, and aim to become far
better in terms of precision, collection capabilities and other key performance metrics. That removes the
biggest potential objection to next-generation sensors, which then allows people to focus on the topic of
cost and ease of use.
Old-style monolithic sensors are typically built with proprietary architectures that make them expensive
to buy and very costly to maintain and repair. A typical large area collection (more than 17 kps – kilo
pixel swath) EO geoimaging monolithic system can cost upwards of $1 million, and when the
system is on the fritz, it creates sizable opportunity costs: airplanes sit idle for days while the unit is
shipped off for repair. In contrast, multi-purpose sensors are built on open architectures that use
commercial off-the-shelf components. That makes their up-front cost a fraction of what monolithic
systems cost, and maintenance and repairs are dramatically less time-consuming and costly since
components can be easily swapped out to complete repairs rather than sending off a unit to be repaired
off-site.
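The cost argument above can be made concrete with a back-of-envelope total-cost-of-ownership sketch. All figures below (purchase price, maintenance, repair frequency, idle days, daily opportunity cost) are illustrative assumptions chosen only to match the article's rough magnitudes; they are not vendor pricing data.

```python
# Illustrative total-cost-of-ownership comparison between a monolithic
# sensor and a modular multi-purpose sensor. Every number here is an
# assumption for illustration, not real pricing.

def total_cost_of_ownership(upfront, annual_maintenance, repair_events_per_year,
                            idle_days_per_repair, daily_opportunity_cost, years):
    """Sum upfront cost, maintenance, and downtime losses over a service life."""
    downtime_loss = (repair_events_per_year * idle_days_per_repair
                     * daily_opportunity_cost * years)
    return upfront + annual_maintenance * years + downtime_loss

# Monolithic: high upfront cost; the unit is shipped off-site for days per repair.
monolithic = total_cost_of_ownership(
    upfront=1_000_000, annual_maintenance=80_000,
    repair_events_per_year=2, idle_days_per_repair=10,
    daily_opportunity_cost=5_000, years=5)

# Modular COTS: lower upfront cost; field-swappable components cut idle time.
modular = total_cost_of_ownership(
    upfront=500_000, annual_maintenance=30_000,
    repair_events_per_year=2, idle_days_per_repair=1,
    daily_opportunity_cost=5_000, years=5)

print(f"monolithic 5-year TCO: ${monolithic:,.0f}")
print(f"modular    5-year TCO: ${modular:,.0f}")
```

Even under these rough assumptions, the downtime term alone adds half a million dollars to the monolithic system over five years, which is why field-swappable components matter as much as the sticker price.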
Another attribute that gives next-generation, multi-purpose sensors a sizable advantage over monolithic
ones is scalability, both in terms of collection (from medium to the largest formats) and functionality. Single-purpose
sensors are highly inflexible by their very nature. You get everything in one box, and because of that, you
pay for features you may not need, at least not when first bought. These “mainframe-like” sensors are
designed to do a single thing, and they aim to do that one job the best they can. But collection companies
today have to be flexible in adapting to the specs of each job they are hired to do. In a given day, they
may do multiple jobs that require very different types of imagery. This diversity of jobs is a direct result
of the growth of our industry.
As more industries learn how to take advantage of geospatial imagery and set out to generate
revenue by applying the Science of Where, the variety of jobs for a collection company has grown in ways
that were unthinkable just a few years ago. They need sensors that can adapt from job to job and hour to
hour, and monolithic sensors are incapable of doing so. To increase the collection capability of one of
those traditional sensors, a collection company would need to buy a whole new sensor. And to add a
capability like Oblique/3D, multi-spectral or thermal imaging, it would need to go out and buy or lease a
specialized sensor just for that job. That just doesn’t make sense today.
In contrast, next-generation multi-purpose sensors have scalable collection capabilities, allowing
collection companies to compete for larger projects by increasing the size of their collection swath (“kps”).
Monolithic sensors have a fixed kps, but multi-purpose sensors can scale up as needed on the fly without a
lot of hullabaloo. They can do it in the field, like flipping a switch between jobs.
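As a rough illustration of swath scalability, the sketch below models the combined swath of several identical camera modules that overlap slightly for image stitching. The module size and overlap fraction are hypothetical values chosen for illustration, not specifications of any real sensor.

```python
# Back-of-envelope model of how a modular sensor scales its collection
# swath ("kps" = kilo-pixel swath) by adding camera modules with a small
# cross-track overlap for stitching. Module size and overlap are
# illustrative assumptions only.

def effective_swath_kps(modules, module_kps=8.5, overlap=0.1):
    """Combined swath of N modules, each overlapping its neighbor by `overlap`."""
    if modules < 1:
        raise ValueError("need at least one module")
    # The first module contributes its full swath; each additional module
    # contributes its swath minus the stitching overlap.
    return module_kps + (modules - 1) * module_kps * (1 - overlap)

# One module covers a medium-format job; adding modules in the field
# pushes the same platform past the 17 kps large-format threshold.
for n in (1, 2, 3):
    print(f"{n} module(s): {effective_swath_kps(n):.2f} kps")
```

Under these assumed numbers, two modules already approach the 17 kps large-format mark and three exceed it, which is the sense in which a modular design lets one platform compete for larger projects without buying a whole new sensor.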
Functional scalability is also a huge advantage of multi-purpose sensors, allowing collection companies to
add infrequently-used collection capabilities on the fly if a customer needs them. As an example, a
collection company that is hired to do an agricultural imaging job once a quarter is often unable to do it
with their existing monolithic, single-purpose sensors. They need to go out and lease multi-spectral
sensors at great cost and effort just for that job, and they may not need them again for several months.
Multi-purpose sensors can add functionality on the fly, allowing a collection company to perform that
specialized job with their existing system.
The next big wave in our industry will be collection via new devices, such as unmanned vehicle systems
(UVSs) and smart-handheld “geospatial gadgets”, and multi-purpose sensors are ideally suited to support
those applications because these sensors can be miniaturized and their open platform makes it simple to
map to the software of UVSs and mobile devices. Old-style single-purpose sensors have an architecture
that is not compatible with these new applications, and I believe this will be a major driver of migration
away from monolithic sensors in the next few years in addition to the reasons I have outlined above.
Traditional sensors still have their strengths, and I believe they may continue to have a welcome home in
niche applications for a bit longer, just like some mainframe computers still continue to be used today.
But the cost pressure alone will be a huge factor in driving adoption of multi-purpose sensors. Collection
companies (especially outside the U.S. and Europe) typically do not have $1+ million lying
around to spend on sensor platforms and costly “mainframe-like” IT processing environments—
particularly when multi-purpose sensors tend to cost at least 50% less while offering better performance,
flexibility and scalability.
Geospatial collection companies have important decisions to make about which type of sensor technology
performs best for them and will meet their needs today, tomorrow and well into the future. The
solutions they select must be able to reduce costs, increase ROI, be multipurpose and reconfigurable as
needed in the field—and very importantly—be as digital-obsolescence-resilient as possible to extend
operation life and ensure a return on the investment.
About the Author:
Dr. Armando Guevara is the President and CEO of Visual Intelligence (www.visualintell.com), a
company that provides geoimaging solutions for airborne, terrestrial and mobile applications including
the iOne family of sensors and the iOne STKA, which won the 2013 Technology Innovation in Sensors
Award from the Geospatial Forum.