Presented by Marjorie Hlava, president of Access Innovations, Inc. on August 10, 2011. Part two of the Special Libraries Association's Leveraging Your Taxonomy series.
This presentation is at the beginner level and mainly focuses on how to create and consume an OData service in ASP.NET. OData (Open Data Protocol) is a standardized protocol for creating and consuming data APIs through regular HTTP requests and REST.
Drilling Down to the Challenges of SharePoint Taxonomy Implementation (TSoholt)
Webinar presented by Marjorie M.K. Hlava of Access Innovations, Inc. and Joe Shepley of Doculabs on August 10, 2011 for the American Society of Information Science & Technology.
We're in a data-driven economy. Web API designers need to define what data to expose, and how, from a variety of apps, services, and stores. What are the challenges of unlocking data and opening up access in a straightforward and standards-compliant manner? Is OData the right tool for the job?
Join Anant, Brian, and Greg for a discussion of OData, its API design implications, and the pros and cons of OData as an enabler of data integration and interoperability across Data APIs.
We Will Discuss »
- OData, SQL, and the "RESTification" of data - providing a uniform way to expose, structure, query and manipulate data using REST principles.
- Opportunity and challenges for OData.
- The questions of Web standards and proprietary versus open tools and protocols.
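As a concrete illustration of the "uniform way to expose, structure, query and manipulate data": OData's standard system query options ($filter, $select, $top) compose into plain URLs. A minimal Python sketch of that URL shape (the service root and entity set below are hypothetical, not from any talk):

```python
from urllib.parse import urlencode, quote

def odata_query_url(service_root, entity_set, filter_expr=None, select=None, top=None):
    """Build an OData query URL from the standard system query options."""
    options = {}
    if filter_expr:
        options["$filter"] = filter_expr          # e.g. "Price lt 20"
    if select:
        options["$select"] = ",".join(select)     # projection of properties
    if top is not None:
        options["$top"] = str(top)                # limit result count
    url = f"{service_root}/{entity_set}"
    if options:
        # keep '$' and ',' literal; percent-encode spaces as %20
        url += "?" + urlencode(options, safe="$,", quote_via=quote)
    return url

# Hypothetical service root; any OData endpoint follows the same URL conventions.
url = odata_query_url(
    "https://example.com/odata", "Products",
    filter_expr="Price lt 20", select=["Name", "Price"], top=5,
)
print(url)
```

The point of the sketch: the query surface is uniform across services, so a generic client can target any OData endpoint with the same URL grammar.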
Building Client-side Search Applications with Solr (Lucene Revolution)
Presented by Daniel Beach, Search Application Developer, OpenSource Connections
Solr is a powerful search engine, but creating a custom user interface can be daunting. In this fast-paced session I will present an overview of how to implement a client-side search application using Solr. Using open-source frameworks like SpyGlass (to be released in September) can be a powerful way to jumpstart your development by giving you out-of-the-box results views with support for faceting, autocomplete, and detail views. During this talk I will also demonstrate how we have built and deployed lightweight applications that remain performant under large user loads with minimal server resources.
Building RESTful Applications with OData (Todd Anglin)
Applications today are expected to expose their data and consume data-centric services via REST. In this session we discuss what REST is, give an overview of WCF Data Services, and see how you can REST-enable your data using the Open Data Protocol (OData). You will then learn how to leverage existing skills with Visual Studio, LINQ, and data access to customize the behavior, control flow, security model, and experience of your data service. We will then see how to enable data binding to traditional ASP.NET controls as well as Silverlight and Excel PowerPivot, and turn to consuming SharePoint and other OData-based applications in .NET as well as from a non-Microsoft client. This is a very demo-intensive session.
Applied Semantic Search with Microsoft SQL Server (Mark Tabladillo)
Text mining is projected to dominate data mining, and the reasons are evident: we have more text available than numeric data. Microsoft introduced a new technology to SQL Server 2012 called Semantic Search. This session's detailed description and demos give you important information for the enterprise implementation of Tag Index and Document Similarity Index. The demos include a web-based Silverlight application, and content documents from Wikipedia. We'll also look at strategy tips for how to best leverage the new semantic technology with existing Microsoft data mining.
The Open Data Protocol, or OData for short, provides a RESTful interface for CRUD operations against data services. OData services, such as Microsoft Azure, SAP, and WebSphere, expose data and metadata as typed name/value pairs in JSON or XML, allowing 'off-the-shelf' data consumers to integrate with services without custom code. This session gives an overview of OData, and explains why salesforce.com selected it as a protocol to integrate with external data services.
OData presentation organized at ITC Hub Pancevo, Serbia, 10 February 2018. https://www.meetup.com/Web-Development-Pancevo/events/247493392/. OData is an enhancement of the classic REST API concept that adds querying capabilities.
"Best Practice in API Design" talk given at phpday 2012 in Verona, Italy. This talk aims to give the best possible advice to anyone publishing a web service of any kind.
Extensible RESTful Applications with Apache TinkerPop (Varun Ganesh)
Presented at Graph Day SF 2018.
You are into data analytics. You come across a source of data and you realise that it is an intuitive case for a Knowledge Graph and that there is much value to be gained by incorporating it into one. How do you take this from zero to product while ensuring that it is well-tested, extensible, scalable and plays nicely with other components and services?
Slack, with its various interactions among its users, is a prime candidate for this. Join us as we take you through our journey of conceptualizing Slack user data as a knowledge graph, evaluating different frameworks, incorporating business logic using TinkerPop with an extensible DSL, and exposing it all through a familiar RESTful interface that allows us to effectively handle an ever-growing, dynamic graph.
A quick description of the presentation:
• What Elasticsearch is and how it works.
• How Elasticsearch analyzes data by splitting a document into meaningful portions and indexing each portion separately, so that whenever a new search request comes in, it knows where to look.
• Features and advantages of Elasticsearch, such as built-in sharding defaults, fail-safe node clusters, and adding new nodes without having to reboot.
• Out-of-the-box features for today's applications, such as faceted search, reverse search using percolators, and pre-built analyzers.
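The analysis step described above, splitting documents into meaningful portions and indexing each portion separately, is the classic inverted-index idea. A minimal Python sketch of the concept (not Elasticsearch code, just the underlying data structure):

```python
from collections import defaultdict

def build_inverted_index(docs):
    """Map each term to the set of document ids that contain it."""
    index = defaultdict(set)
    for doc_id, text in docs.items():
        for term in text.lower().split():  # crude analyzer: lowercase + whitespace split
            index[term].add(doc_id)
    return index

def search(index, query):
    """Return ids of documents containing every query term (AND semantics)."""
    results = None
    for term in query.lower().split():
        postings = index.get(term, set())
        results = postings if results is None else results & postings
    return results or set()

docs = {
    1: "Elasticsearch is a search server",
    2: "Solr is a search server",
    3: "Lucene is a library",
}
index = build_inverted_index(docs)
print(search(index, "search server"))  # documents 1 and 2
```

Because the index maps terms to documents up front, a query only touches the postings for its own terms instead of scanning every document; real engines add analyzers, scoring, and sharding on top of this core.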
The tutorial covers big data search, contenders, an introduction to Elasticsearch, more-than-just-search features, and uncharted territory. It begins with a brief look at big data search: rapid data consumption and the challenges it creates. A section on the contenders follows, covering Lucene, Apache Solr, Sphinx, and Elasticsearch itself.
There is also an introduction to Elasticsearch as a search server and its features, such as push replication, node auto-discovery, and fail-safe operation, along with analyzing data and ways of indexing it right. A section on more than search follows, covering facets (range, histogram, and geo facets), the percolator, and how Elasticsearch percolating works.
The last section covers uncharted territory: Elasticsearch as a NoSQL database, "what if" scenarios, and references.
ElasticSearch - index server used as a document database (Robert Lujo)
Presentation held on 5 October 2014 at http://2014.webcampzg.org/talks/.
Although ElasticSearch's (ES) primary purpose is to be an index/search server, its feature set overlaps with that of a common NoSQL database, or more precisely, a document database.
Why could this be interesting, and how can it be used effectively?
Talk overview:
- ES - history, background, philosophy, featureset overview, focus on indexing/search features
- short presentation on how to get started - installation, indexing and search/retrieving
- a database should provide the following functions: store, search, retrieve -> differences between relational, document, and search databases
- it is not unusual to additionally use ES as a document database (store and retrieve)
- a use case will be presented where ES is used as the single database in the system (benefits and drawbacks)
- what happens if a relational database is introduced into the previously demonstrated system (benefits and drawbacks)
ES is a nice and, in practice, ready-to-use example that can change the perspective on how some types of software systems are developed.
2017-01-11 Intelligent Search and Intranet - Chihuahuas vs Muffins v1 (Don Miller)
This is a presentation for people looking to improve Enterprise Search and Intranets. It provides details around Microsoft Search, Azure Search and Elastic Search and how to take a basic search platform and transform it into what Gartner calls Insight Engines and what Forrester calls Cognitive Search and Knowledge Discovery.
OpenDistro for Elasticsearch and how Bitergia is using it (Madrid DevOps, Javier Ramirez)
Talk given at Madrid DevOps, October 2019. Javier Ramirez (@supercoco9), Tech Evangelist at AWS, and Jose Manrique (@jsmanrique), Open Source and Mobile expert and CEO of Bitergia, came to Madrid DevOps to explain what Open Distro for Elasticsearch is, how it differs from the official Elastic distribution, and what benefits it can offer if you are already using Elasticsearch. They also presented a real-world case of a migration to Open Distro and how it is being used to analyze metrics of open source projects.
EPC Group - Comprehensive Overview of SharePoint 2010's Enterprise Search Capabilities (EPC Group)
A comprehensive overview of SharePoint 2010's enterprise search capabilities, to assist with your roadmap planning and to help you and your organization understand what is possible with SharePoint 2010.
ElasticSearch in Production: lessons learned (BeyondTrees)
With ProQuest Udini, we have created the world's largest online article store, and aim to be the center for researchers all over the world. We connect to a 700M Solr cluster for search, but have recently also implemented a search component with ElasticSearch. We will discuss how we did this, and how we want to use the 30M index for scientific citation recognition. We will highlight lessons learned integrating ElasticSearch into our virtualized EC2 environments, and challenges aligning it with our continuous deployment processes.
7-9. Understanding Search
Why do we search?
- Basic human instinct → We want answers
- Humans are information-driven → Knowledge is key to survive
- Brainless → Modeling the world
- The River and the Fish → Information is flooding us, but most of the time we're interested in only a specific slice of the "big cake"
29. Understanding Search
“The Search”: User Input
What makes it complex?
- Creating an intuitive yet powerful frontend (UI)
- Expose syntax, but not overwhelm end-user
- Features to provide:
- Faceted Search
- Search suggestions, Live Search
- Related searches
- Default View: Basic or Advanced
- Type of user input: keywords, image, geolocation (coordinates)
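Faceted search, listed above as a frontend feature, boils down to counting the matching documents per field value so the UI can show filters like "format (3)". A toy Python sketch of facet counting (document fields below are made up for illustration):

```python
from collections import Counter

def facet_counts(docs, field):
    """Count how many documents fall into each value of one field (a single facet)."""
    return Counter(doc[field] for doc in docs if field in doc)

# Hypothetical result set already matched by a query.
docs = [
    {"title": "Intro to Solr", "format": "video", "lang": "en"},
    {"title": "Elasticsearch 101", "format": "video", "lang": "en"},
    {"title": "Lucene Internals", "format": "pdf", "lang": "de"},
]
print(facet_counts(docs, "format"))  # Counter({'video': 2, 'pdf': 1})
```

Real engines compute these counts inside the index (Solr facets, Elasticsearch aggregations) so they stay cheap even over millions of matches, but the contract with the UI is exactly this value-to-count map.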
32. Understanding Search
"The Search": Magic
What makes it complex?
- Different types of content → Abstraction, Transformation
- Different repositories → Integration
- Language characteristics (Site & Content Localization)
- Performance, permissions, restrictions → Operation, Implementation, Business needs
- Mapping user thinking → Queries
- Scalability → Operation
- Finding the right platform → Architecture
38-39. Understanding Search
"The Search": Content
The world is not ideal → Data comes in different forms → Search should operate on a common form (transparency for the end-user)
40. Understanding Search
Summary
What do we need?
- Data Transformation → Pre-Search
- Search Backend → Search → Magic :-)
- Search Frontend → Couldn't find a good word for this, but writing something just to keep the format :-)
43. Technology Stack
LuceneTM
Full-featured text search engine library
Java-based
Document model
Provides indexing, scoring, spellchecking, hit highlighting and advanced
analysis/tokenization capabilities
Provides a wide range of queries
Open Source (Apache License 2.0)
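Lucene's scoring, mentioned above, is classically rooted in TF-IDF: a term matters more the more often it appears in a document and the fewer documents it appears in. A toy Python illustration of the idea (not Lucene's actual implementation, which adds length normalization and, in recent versions, defaults to BM25):

```python
import math

def tf_idf(term, doc, docs):
    """Toy TF-IDF: term frequency in the doc times smoothed inverse document frequency."""
    tf = doc.count(term)                              # raw term frequency
    df = sum(1 for d in docs if term in d)            # document frequency across the corpus
    idf = math.log(len(docs) / (1 + df)) + 1          # rarer terms score higher
    return tf * idf

docs = [
    "the quick brown fox".split(),
    "the lazy dog".split(),
    "the quick dog".split(),
]
score = tf_idf("quick", docs[0], docs)
```

A term like "the", present in every document, gets a lower weight than the rarer "quick", which is exactly the relevance behavior a plain database lookup cannot give you.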
49. Technology Stack
Lucene: Search Index vs. DB
- Different purposes
- DB: permanent storage, data-driven, application-centric
- Search Engine, Index: optimized for search, user-centric
- Search in a search engine is about relevance
- Database searches do not provide fuzzy searching or any type of relevancy
- Can apply algorithms like "More Like This" to obtain similar content
- Advanced Features
- Geolocation
- Faceting of results
- Multi-lingual searching
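The fuzzy searching contrasted above with database lookups is typically backed by an edit-distance measure such as Levenshtein distance: a term matches if it is within a small number of single-character edits of the query. A minimal sketch of the distance and the match rule:

```python
def levenshtein(a, b):
    """Minimum number of single-character insertions, deletions, or
    substitutions needed to turn string a into string b (row-by-row DP)."""
    prev = list(range(len(b) + 1))
    for i, ca in enumerate(a, 1):
        curr = [i]
        for j, cb in enumerate(b, 1):
            curr.append(min(
                prev[j] + 1,                  # delete ca
                curr[j - 1] + 1,              # insert cb
                prev[j - 1] + (ca != cb),     # substitute (free if equal)
            ))
        prev = curr
    return prev[len(b)]

def fuzzy_match(query, term, max_edits=2):
    """Accept a term within max_edits edits of the query (cf. fuzzy queries)."""
    return levenshtein(query, term) <= max_edits

print(levenshtein("elastic", "elastik"))  # 1
```

Production engines don't compare the query against every term like this; Lucene compiles the query into an automaton and walks the term dictionary, but the accepted matches are the same.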
51. Technology Stack
Solr
Highly reliable, scalable, and fault-tolerant
Provides distributed indexing, replication and load-balanced querying, automated
failover and recovery, centralized configuration and more
Standalone enterprise search server (web application) with a REST-like API
Wide range of clients
Apache Foundation
58. Technology Stack
Elasticsearch: Features
Massively distributed: ES allows you to start small and scale horizontally as you grow. Simply add more nodes, and let the cluster automatically take advantage of the extra hardware.
73. Search in Liferay
The Liferay Search Infrastructure
[Diagram] Liferay Platform (assets: web content, message boards, wiki pages...) → Search infrastructure (magic happens here) → Search engine(s) (indices, documents, analysis...)
74. Search in Liferay
The Evolution of Search
→ 2004: Lucene: from the very beginnings
- Liferay Search APIs: built around/on top of Lucene, the default engine
→ 2008 (5.1): abstracting out the search mechanism → Solr integration (plugin)
→ 2011 (6.0): Faceted Search support
75. Search in Liferay
The Evolution of Search
→ 2011-2014 (6.0-6.2): enhancing search functionality
- Suggestions, "Did You Mean", filtering, improved permission awareness
→ 2016 (7.0): Elasticsearch integration
- Default search engine
- Generic queries
- OSGi module
- Partnership with Elastic
76. Search in Liferay
Why Elasticsearch?
- Best of breed
- Built for modern web applications
- Distributed and clusterable by design
- Lucene-based
- Great vendor support
- Great monitoring tools: Marvel
77. Search in Liferay
Great for Developers
- Open source
- Amazing documentation
- High "just works" factor, e.g. zero-config indexing and clustering
- REST for queries, health, admin - everything
- Great Java client API
- Pretty JSON for talks ;-)
80. Search in Liferay
Liferay 7 EE Search Features: Security
Shield | Security for Elasticsearch
- Protect your Liferay index with a username and password
- SSL/TLS encryption for traffic within the Liferay Elasticsearch cluster
- Elasticsearch plugin - no need for an external security solution
- Restrict access to Liferay Portal instances with IP filtering
- Liferay integration
81. Search in Liferay
Liferay 7 EE Search Features: Monitoring
Marvel | Monitoring for Elasticsearch
- Real-time and historical analysis
- Real-time cluster health at a glance
- Multicluster support
- Liferay integration
83. Search in Liferay
Liferay 7 EE Search Features: Licensing
Details are coming soon...
84. Resources
André de Oliveira: Harnessing the power of search, Liferay DEVCON, 2015
Michael Han: Search Reference documentation (Liferay Internal Documentation)
Tibor Lipusz: Introduction to Indexing & Searching, Faceted Search in Liferay (Liferay Internal Training)
Tibor Lipusz: Introduction to Search: Elasticsearch and Solr (Liferay Internal Training)
Source of Images, Figures:
www.liferay.com
https://dev.liferay.com
www.amazon.com
www.google.com
http://lucene.apache.org
www.elastic.co
http://lucene.apache.org/solr/