REonomy's mission is to revolutionize the commercial real estate industry by providing professionals with easy-to-access, actionable investment data and analytics. Its solution aggregates data from various sources, then cross-references, unifies, and cleanses it, allowing users to perform trend, market, and correlation analysis through a self-service reporting and analytics interface. The target markets include commercial brokers, institutional investors, mortgage bankers, and others who spend billions annually on commercial real estate data and analytics currently available only through antiquated and costly providers.
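The cross-referencing, unification, and cleansing step described above can be illustrated with a minimal sketch. The field names (`parcel_id`, `owner`, `tax`, `deed`) and the normalization rules here are hypothetical, not REonomy's actual schema:

```python
def normalize(record):
    """Trim and lowercase string fields so the same entity matches across feeds."""
    return {k: v.strip().lower() if isinstance(v, str) else v
            for k, v in record.items()}

def unify(feeds):
    """Merge records from several raw feeds, keyed on a shared parcel ID.

    Later feeds fill in fields the earlier ones were missing, so the
    result is one cleansed record per property.
    """
    merged = {}
    for feed in feeds:
        for rec in map(normalize, feed):
            key = rec["parcel_id"]
            merged.setdefault(key, {}).update(
                {k: v for k, v in rec.items() if v not in (None, "")})
    return merged

# Two hypothetical raw feeds describing the same parcel
tax_feed = [{"parcel_id": "100-22", "owner": " ACME LLC ", "tax": 41000}]
title_feed = [{"parcel_id": "100-22", "owner": "acme llc", "deed": "d-778"}]

properties = unify([tax_feed, title_feed])
# properties["100-22"] now holds owner, tax, and deed in one unified record
```

A real pipeline would add fuzzy entity matching and conflict resolution; this only shows the merge-on-key idea.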
2. Mission
REonomy’s mission is to revolutionize the commercial real estate, related debt, and commercial real estate backed securities investment industries by providing industry professionals with easy-to-access, actionable investment data and analytics that are necessary to make accurate evaluation, loan, trading, and investment decisions.
Confidential
3. Target Markets
Collective CRE market covering >$11T in assets:
- ~450k licensed commercial real estate brokers
- 280,000 mortgage bankers (The Mortgage Bankers Association)
- 9,757 RE-focused institutional investment organizations
- Bond traders and analysts, investment banks & loan originators
- Private investors, developers, insurers, appraisers

Raw Data Market:
- CRE brokerages currently pay $3k-$11k/broker/yr. for data
- CRE brokerages, institutional investors, mortgage bankers, and loan originators currently spend $30k-$100k/analyst/yr. for data
- Addressable market size: $2.5 BB

Securities Analytics Market:
- CRE-backed securities analytics and ratings
- Institutional investors, investment banks, loan originators & bond traders spend hundreds of thousands of dollars on bond or securities analytics and ratings
- Addressable market size: > $1 BB
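The raw-data figures above imply a market-size range by simple multiplication. A quick check of the broker segment, using only the deck's own numbers:

```python
brokers = 450_000                        # licensed CRE brokers (from the slide)
spend_low, spend_high = 3_000, 11_000    # $/broker/yr. for data (from the slide)

low = brokers * spend_low    # lower bound on annual broker data spend
high = brokers * spend_high  # upper bound
print(f"broker data spend: ${low/1e9:.2f}B - ${high/1e9:.2f}B")
# The slide's $2.5 BB addressable raw-data market figure also covers the
# $30k-$100k/analyst segment, for which no headcount is stated.
```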
Confidential
4. MARKET CONTEXT
Market Challenges:
- Data availability, quality, latency, & structure issues
- Antiquated providers of single raw data types
- Wild inaccuracies – garbage into models, garbage out
- Cost-prohibitive & inefficient

REonomy Solution:
- Solve all data issues
- Unified, single-source reference
- Extrapolate new critical data
- Streamlined & de-specialized research tools and analytics
- Extremely cost effective
9. SOLUTION OVERVIEW
The platform has three stages: industry "fragmented" raw data feeds capturing national, state, and local data; the REonomy proprietary database server with its proprietary reporting and analytics application server; and the REonomy self-service reporting and data analysis user interface.

Data Category: Source(s):
- REIT Properties & Mortgage Financials: Division of Federal Government
- Parcel Data: Division of Municipal Government
- Entity Incorporation Filings: Division of State Government
- Securitized Mortgage: Over 60 Banks
- Phone Numbers: Phonebook Directories
- Rent Control Status: Division of State Government
- Title Documents: Division of Municipal Government
- Middle-Market Mortgage: Title Documents
- Middle-Market Mortgage Purchase: $0.23/record of 4 data points
- Building Permits & Violations: Division of Municipal Government
- Property Taxes: Division of Municipal Government
- Real Property Income & Expense: Division of Municipal Government
- Space Availability - For Lease: Hundreds of Websites & Companies
- Space Availability - For Sale: Hundreds of Websites & Companies
- Rolling Sales: Division of Municipal Government
- Tenant Information: Phonebooks, Entity Search, Banks, Research

Raw data is repurposed, cleansed, and processed by an intelligent-information algorithm into the REonomy Index™, supporting:
- Cross Referenced Trend Analysis
- Unified Analytics
- Aggregated Historical Analysis
- Automated Market Analysis
- Categorized Correlation Analysis

Current Sample Reports:
- Physical characteristics of building and lot, breakout of space usage
- Itemized income and expense data, and comprehensive space usage
- Owner information – LLC and name
- Extensive tax information, including city assessment values
- All title documents including land data
- Complete zoning, landmark and air rights information
- Debt performance, property occupancy, itemized income data, effective gross income
- Securitized property’s operating expenses, net operating income, and the debt service coverage ratio for each tranche of debt
- Other…

Product demo and detailed data architecture available upon request.
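The last sample report above computes debt service coverage per tranche of debt. DSCR is the standard ratio of net operating income to annual debt service; a minimal sketch, with made-up tranche figures (per-tranche DSCR conventions vary by deal, so this uses cumulative debt service through each tranche for illustration):

```python
def dscr(noi, debt_service):
    """Debt service coverage ratio: NOI / annual debt service.
    Above 1.0 means property income covers the required payments."""
    return noi / debt_service

# Hypothetical securitized property with two tranches of debt
noi = 1_200_000
tranches = [("senior", 800_000), ("mezzanine", 250_000)]

running = 0
for name, service in tranches:
    running += service  # cumulative debt service through this tranche
    print(f"through {name}: DSCR = {dscr(noi, running):.2f}")
```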
12. BUSINESS MODEL
Pricing Model
- SaaS, tiered data packages: $140-$250/seat/month
- Multi-tiered (premium) data delivery packages
- Strictly company- or division-wide subscriptions

Future Products
- CMBS & TLBS analytics
- Portfolio risk analysis
- CMBS & TLBS ratings as an NRSRO
- Commercial REIT coverage
- REonomy Market Index™

Additional Revenue Streams
- Data services for investment advisory firms
13. Overview
Company Overview
- Self-service data solution & SaaS analytics platform
- Commercial real estate is under increased regulation and market stress, creating a strong need for better data and quality decision tools
- Strong customer value proposition with no switching costs

What We Want
- Strategic partners
- Investors

Contact
Charles Oshman, Co-Founder & CEO
coshman@reonomy.com
Office: 212.521.4137
590 Madison Avenue, 21st Floor, New York, NY 10022