This document describes MNPR, a framework for real-time, expressive non-photorealistic rendering (NPR) of 3D computer graphics. The framework provides a wide spectrum of control over NPR styles through multiple levels of art direction, enables cross-stylization between different styles, and offers implementation insights through open-source code. It addresses limitations of previous work, such as narrow spectrums of control, proprietary implementations, and lack of support for cross-stylization.
MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Graphics
1. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
An IGS Student Seminar
by Santiago Montesdeoca
16 March 2016
2. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Contribution
3. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Contribution
• Art-direction covering the interaction spectrum within NPR
4. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Contribution
• Art-direction covering the interaction spectrum within NPR
• User study – validate usefulness of each level of control
5. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Contribution
• Art-direction covering the interaction spectrum within NPR
• User study – validate usefulness of each level of control
• Control semantics for cross-stylization
6. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Contribution
• Art-direction covering the interaction spectrum within NPR
• User study – validate usefulness of each level of control
• Control semantics for cross-stylization
• Implementation insights and source code of MNPR
7. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
3D model by Inuciian
Contribution
8. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
3D model by Inuciian
Contribution
9. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
3D model by Inuciian
Contribution
10. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
3D model by Inuciian
Contribution
11. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
3D model by Inuciian
Contribution
12. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Motivation
Artistic need
13. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Wide research in NPR
Motivation
14. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Wide research in NPR
Motivation
but limited application in 3D
15. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
NPR is not expressive by itself
Motivation
16. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
3D model by Oculus
Motivation
17. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
3D model by Oculus
Motivation
18. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
3D model by Oculus
Motivation
19. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
3D model by Tom Robinson
Motivation
20. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
3D model by Tom Robinson
Motivation
21. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
3D model by Tom Robinson
Motivation
22. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Every style needs art-direction
Motivation
23. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Every style needs art-direction
Motivation
Art-direction can cross styles
24. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
NPR is not expressive by itself
Paintings by Dylan Scott Pierce
25. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
“Overall, NPR researchers might put more emphasis on assisting art creation, rather
than automating it.”
Holger Winnemöller, 2013
Motivation
26. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
“Overall, NPR researchers might put more emphasis on assisting art creation, rather
than automating it.”
Holger Winnemöller, 2013
“As researchers, however, we should focus—in addition to working on algorithmic
contributions—on how to design the interaction with the algorithmic support”
Tobias Isenberg, 2016
Motivation
27. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
No intuitive and generalized approach
towards expressive NPR in 3D
Motivation
28. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Related Work
3D Stylization
29. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
3D stroke-space stylization
Related Work
30. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
3D stroke-space stylization
• Seminal work – Hanrahan and Haeberli 1990
[Hanrahan and Haeberli 1990]
Related Work
31. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
3D stroke-space stylization
• Seminal work – Hanrahan and Haeberli 1990
• Deep Canvas – Daniels 1999
[Hanrahan and Haeberli 1990]
Related Work
32. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
3D stroke-space stylization
• Seminal work – Hanrahan and Haeberli 1990
• Deep Canvas – Daniels 1999
• OverCoat – Schmid et al. 2011
[Hanrahan and Haeberli 1990]
[Schmid et al. 2011]
Related Work
33. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Specialized frameworks
Related Work
34. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Specialized frameworks
• Jot (linework) – Kalnins et al. 2003
Related Work
[Kalnins et al. 2003]
35. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Specialized frameworks
• Jot (linework) – Kalnins et al. 2003
• aQtree (watercolor) – Luft et al. 2007
Related Work
[Kalnins et al. 2003]
[Luft et al. 2007]
36. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Specialized frameworks
• Jot (linework) – Kalnins et al. 2003
• aQtree (watercolor) – Luft et al. 2007
• Freestyle (linework) – Grabli et al. 2010
Related Work
[Kalnins et al. 2003]
[Luft et al. 2007] [Grabli et al. 2010]
37. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Specialized frameworks
• Jot (linework) – Kalnins et al. 2003
• aQtree (watercolor) – Luft et al. 2007
• Freestyle (linework) – Grabli et al. 2010
• Meander (linework) – Whited et al. (Disney)
Related Work
[Kalnins et al. 2003]
[Luft et al. 2007] [Grabli et al. 2010] [Disney 2013]
38. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Generalized NPR frameworks
Related Work
39. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Generalized NPR frameworks
• OPENNPAR – Halper et al. 2002, 2003
Related Work
[Halper et al. 2002, 2003]
40. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Generalized NPR frameworks
• OPENNPAR – Halper et al. 2002, 2003
• RenderBots – Schlechtweg et al. 2005
Related Work
[Halper et al. 2002, 2003] [Schlechtweg et al. 2005]
41. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Generalized NPR frameworks
• OPENNPAR – Halper et al. 2002, 2003
• RenderBots – Schlechtweg et al. 2005
• Example-based – Bénard et al. 2013, Fišer et al. 2016
Related Work
[Halper et al. 2002, 2003] [Schlechtweg et al. 2005] [Fišer et al. 2016]
42. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Closest to our work
Related Work
43. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Closest to our work
• 2D XML-based framework – Semmo et al. 2016
• Default presets
• Global parameters
• Local adjustments
Related Work
44. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Closest to our work
• 2D XML-based framework – Semmo et al. 2016
• Default presets
• Global parameters
• Local adjustments
Related Work
[Semmo et al. 2016]
45. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Limitations
Related Work
46. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Limitations
• Proprietary
Related Work
47. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Limitations
• Proprietary
• Standalone applications
Related Work
48. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Limitations
• Proprietary
• Standalone applications
• Narrow spectrum of control
Related Work
49. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Limitations
• Proprietary
• Standalone applications
• Narrow spectrum of control
• No cross-stylization
Related Work
50. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Practical answer, in the context of
interactive filter-based stylization of 3D
computer graphics
Related Work
51. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
MNPR: Expressive NPR Framework
“In the same way that scientists concocted new colors and artisans crafted new
tools for painters to enable their visions, we need to provide these to 3D artists.”
52. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Contents
• Levels of Control
• Cross-stylization
• Discussion and QA
MNPR: Expressive NPR Framework
53. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Levels of Control
Covering the interaction spectrum in 3D
54. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Interaction spectrum
Levels of Control
[Isenberg 2016]
55. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Why?
Interaction spectrum
Levels of Control
[Isenberg 2016]
56. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Why?
• Augments the artistic workflow
Interaction spectrum
Levels of Control
[Isenberg 2016]
57. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Why?
• Augments the artistic workflow
• Maximizes the potential
Interaction spectrum
Levels of Control
[Isenberg 2016]
58. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Why?
• Augments the artistic workflow
• Maximizes the potential
• Highlights intrinsic problems
Interaction spectrum
Levels of Control
[Isenberg 2016]
59. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Levels of Control
60. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Style presets Global control
Levels of Control
61. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Style presets Global control
Levels of Control
62. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Style presets
• Defines the stylization
Global control
Levels of Control
63. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Style presets
• Defines the stylization
• Save/load global control presets
Global control
Levels of Control
64. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Style presets
• Defines the stylization
• Save/load global control presets
Global control
Levels of Control
• Defines the style
65. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Style presets
• Defines the stylization
• Save/load global control presets
Global control
Levels of Control
• Defines the style
• Effect variables
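As an illustration of what saving and loading a style preset might look like, here is a minimal Python sketch, assuming presets are plain JSON key-value files and using hypothetical parameter names (not MNPR's actual implementation):

```python
import json

# Hypothetical global effect variables driving a watercolor style.
style = {
    "pigment_density": 1.2,      # overall pigment concentration
    "edge_darkening": 0.8,       # intensity of darkened edges
    "substrate_roughness": 0.5,  # how strongly the paper shows through
    "bleeding_radius": 3.0,      # color bleeding size in pixels
}

def save_preset(path, params):
    """Serialize the current global control values as a style preset."""
    with open(path, "w") as f:
        json.dump(params, f, indent=2)

def load_preset(path):
    """Restore a previously saved style preset."""
    with open(path) as f:
        return json.load(f)

save_preset("watercolor.json", style)
assert load_preset("watercolor.json") == style
```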
66. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Levels of Control
3D model by Slava Zhuravlev
67. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Material presets Material control
Levels of Control
68. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Material presets Material control
Levels of Control
69. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Material presets
• Save/load material attributes
and procedural parameters
Material control
Levels of Control
70. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Material presets
• Save/load material attributes
and procedural parameters
• Modifies underlying shaders
Material control
Levels of Control
71. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Material presets
• Save/load material attributes
and procedural parameters
• Modifies underlying shaders
Material control
Levels of Control
• Defines the material attributes
72. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Material presets
• Save/load material attributes
and procedural parameters
• Modifies underlying shaders
Material control
Levels of Control
• Defines the material attributes
• Drives procedural effect
parameters
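A minimal sketch of the same idea at the material level, assuming a material preset bundles shading attributes with procedural effect parameters that override a shader's parameter set (all names hypothetical):

```python
# Hypothetical material preset: shading attributes plus procedural
# effect parameters that drive the stylization per material.
material_preset = {
    "attributes": {"diffuse_color": (0.8, 0.6, 0.4), "specularity": 0.1},
    "procedural": {"pigment_density": 1.5, "edge_width_scale": 0.7},
}

def apply_material_preset(shader_params, preset):
    """Overwrite a shader's parameter set with preset values."""
    shader_params.update(preset["attributes"])
    shader_params.update(preset["procedural"])
    return shader_params

shader = {"diffuse_color": (1, 1, 1), "specularity": 0.5,
          "pigment_density": 1.0, "edge_width_scale": 1.0}
apply_material_preset(shader, material_preset)
```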
73. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
3D model by Slava Zhuravlev
74. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
3D model by Slava Zhuravlev
75. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Mapped control
Levels of Control
76. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Mapped control
Levels of Control
77. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
• Locally control effects
Mapped control
Levels of Control
78. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
• Locally control effects
• Painting effect parameters onto 3D objects
Mapped control
Levels of Control
79. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
• Locally control effects
• Painting effect parameters onto 3D objects
• Most versatile, but time consuming
Mapped control
Levels of Control
80. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
• Locally control effects
• Painting effect parameters onto 3D objects
• Most versatile, but time consuming
Mapped control
Levels of Control
Procedural pigment density: < 5 sec.
Mapped pigment density: > 40 sec.
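To make the painting metaphor concrete, the sketch below stamps a Gaussian brush footprint into a 2D effect-parameter map, a plausible stand-in for painting pigment density onto an object (illustrative only, not MNPR's brush code):

```python
import numpy as np

def paint_parameter(param_map, cx, cy, radius, strength):
    """Stamp a Gaussian brush footprint into a 2D effect-parameter map,
    the way an artist would locally paint e.g. pigment density."""
    h, w = param_map.shape
    y, x = np.mgrid[0:h, 0:w]
    footprint = np.exp(-((x - cx) ** 2 + (y - cy) ** 2) / (2 * radius ** 2))
    param_map += strength * footprint
    return np.clip(param_map, 0.0, 1.0, out=param_map)

density = np.zeros((256, 256), dtype=np.float32)
paint_parameter(density, cx=128, cy=128, radius=20, strength=0.8)
```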
81. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
3D model by Slava Zhuravlev
82. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
3D model by Slava Zhuravlev
83. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Proxy control
Levels of Control
3D model by Slava Zhuravlev
84. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
• Standalone “invisible” stylization objects within the scene
Proxy control
Levels of Control
3D model by Slava Zhuravlev
85. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
• Standalone “invisible” stylization objects within the scene
• Arbitrary representation: high and low level of control
Proxy control
Levels of Control
3D model by Slava Zhuravlev
86. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
• Standalone “invisible” stylization objects within the scene
• Arbitrary representation: high and low level of control
• Supports procedural and local effect parameters
Proxy control
Levels of Control
3D model by Slava Zhuravlev
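A minimal sketch of the proxy idea, assuming a spherical proxy with a linear falloff that modulates a local effect parameter on nearby surface points (hypothetical names and falloff, not MNPR's actual code):

```python
import numpy as np

def proxy_falloff(points, center, radius):
    """Weight (1 at the proxy center, 0 at its radius) that an invisible
    proxy object contributes to nearby surface points."""
    d = np.linalg.norm(points - center, axis=-1)
    return np.clip(1.0 - d / radius, 0.0, 1.0)

# Hypothetical: boost edge intensity near a proxy sphere placed in the scene.
surface_points = np.random.rand(1000, 3) * 10.0
weights = proxy_falloff(surface_points, np.array([5.0, 5.0, 5.0]), radius=2.0)
edge_intensity = 0.3 + 0.7 * weights  # base value modulated by the proxy
```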
87. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Live demo, levels of control
Real-time showcase
88. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
3D scene by one of our test users
89. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
3D scene by one of our test users
90. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
What is happening in the back-end?
Levels of Control
91. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
What is happening in the back-end?
Stylization maps!
Levels of Control
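A rough sketch of the back-end idea: parameters from every level of control are accumulated into screen-space stylization maps, which image-space passes then read per pixel. The channel layout and the density formula below are hypothetical assumptions, for illustration only:

```python
import numpy as np

h, w = 270, 480
# One RGBA stylization map: each channel carries one effect parameter.
pigment_map = np.zeros((h, w, 4), dtype=np.float32)
VARIATION, APPLICATION, DENSITY = 0, 1, 2  # hypothetical channel layout

def pigment_density_pass(color, stylization):
    """Image-space pass: concentrate pigment where density > 0."""
    d = stylization[..., DENSITY:DENSITY + 1]
    return color ** (1.0 + d)  # denser pigment -> darker, more saturated

color = np.random.rand(h, w, 3).astype(np.float32)
pigment_map[..., DENSITY] = 0.5
stylized = pigment_density_pass(color, pigment_map)
```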
92. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Levels of Control
[Diagram: stylization map render targets]
3D model by Slava Zhuravlev
93. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
3D model by Slava Zhuravlev
94. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
3D model by Slava Zhuravlev
95. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Levels of Control
96. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Evaluating usefulness
Levels of Control
97. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Evaluating usefulness
Levels of Control
User study
98. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Evaluating usefulness
• 65 participants
Levels of Control
User study
99. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Evaluating usefulness
• 65 participants
• Watercolor style
Levels of Control
User study
100. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Evaluating usefulness
• 65 participants
• Watercolor style
• ~25 mins of tutorials
Levels of Control
User study
101. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Evaluating usefulness
• 65 participants
• Watercolor style
• ~25 mins of tutorials
• 2+ hours experimenting
Levels of Control
User study
102. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Evaluating usefulness
• 65 participants
• Watercolor style
• ~25 mins of tutorials
• 2+ hours experimenting
• Online questionnaire
Levels of Control
User study
103. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Evaluating usefulness
• 65 participants
• Watercolor style
• ~25 mins of tutorials
• 2+ hours experimenting
• Online questionnaire
• 20 responses
Levels of Control
User study
104. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Evaluating usefulness
• 65 participants
• Watercolor style
• ~25 mins of tutorials
• 2+ hours experimenting
• Online questionnaire
• 20 responses
[Chart: participant experience, from no experience to highly experienced, in traditional watercolor, NPR solutions, and Autodesk Maya; plus years of experience in CG]
Levels of Control
User study
105. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Evaluating each Level of Control
[Chart: perceived usefulness, from not useful to highly useful, of style presets & global control, material presets & material control, material effect control, mapped control, and proxy control; plus test duration in hours]
Levels of Control
106. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Overall satisfaction
[Chart: agreement, from strongly disagree to strongly agree, with: stylistic goal achieved, range of control beneficial, future use of the system]
Levels of Control
107. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Cross-stylization
Generalizing control semantics
108. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
NPR is not expressive by itself
Paintings by Dylan Scott Pierce
109. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Enabling cross-stylization
Cross-stylization 3D model by Slava Zhuravlev
110. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Enabling cross-stylization
Cross-stylization
• Stylization driven by effect parameters
3D model by Slava Zhuravlev
111. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Enabling cross-stylization
Cross-stylization
• Stylization driven by effect parameters
• All effect parameters stored in stylization maps
3D model by Slava Zhuravlev
112. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Enabling cross-stylization
Cross-stylization
• Stylization driven by effect parameters
• All effect parameters stored in stylization maps
Is there any correlation between effects in different styles?
3D model by Slava Zhuravlev
113. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Effect categories
Cross-stylization
• Pigment-based effects
• Substrate-based effects
• Edge-based effects
• Abstraction-based effects
114. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Effect categories
Cross-stylization
• Pigment-based effects
• Substrate-based effects
• Edge-based effects
• Abstraction-based effects
Why not correlate effects within these categories?
Each category is stored in its own stylization map (sketched below)
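For illustration, the claim above can be made concrete in a few lines. The following is a minimal C++ sketch, assuming one screen-sized RGBA map per category with up to four per-pixel parameters packed into its channels; the packing itself is an assumption for illustration, not MNPR's documented layout:

    // Sketch: one RGBA parameter map per effect category.
    // Channel packing is an illustrative assumption.
    #include <array>
    #include <vector>

    enum class EffectCategory { Pigment, Substrate, Edge, Abstraction };

    struct ParamPixel { float r, g, b, a; };  // up to 4 parameters per category

    struct StylizationMaps {
        int width, height;
        std::array<std::vector<ParamPixel>, 4> maps;  // indexed by EffectCategory

        StylizationMaps(int w, int h) : width(w), height(h) {
            for (auto& m : maps)
                m.assign(static_cast<size_t>(w) * h, ParamPixel{0.f, 0.f, 0.f, 0.f});
        }

        ParamPixel& at(EffectCategory c, int x, int y) {
            return maps[static_cast<size_t>(c)][static_cast<size_t>(y) * width + x];
        }
    };

Grouping a whole category into one map means a painted edit addresses a family of related effects rather than one style's private parameter, which is what makes correlating effects across styles possible.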
115. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Stylization control semantics
Cross-stylization
• Sufficiently generic
• Semantically meaningful
• Adhere to a defined control scheme
120. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Pigment-based effects
Cross-stylization
• Pigment variation
• Variation toward either constituent color of a compound pigment
• Pigment application
• Placement of the pigments over the substrate
• Pigment density
• Concentration of pigments
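As a toy example of how one of these parameters might act per pixel (this is not MNPR's actual shading code, which is considerably more involved), pigment density can be treated as an interpolation between the substrate color and the pigment color:

    // Toy sketch: pigment density as a substrate-to-pigment interpolation.
    // MNPR's real watercolor shading is considerably more involved.
    struct Color { float r, g, b; };

    static float lerpf(float a, float b, float t) { return a + t * (b - a); }

    // density = 0: bare substrate; density = 1: fully applied pigment.
    Color applyPigmentDensity(const Color& pigment, const Color& substrate,
                              float density) {
        return { lerpf(substrate.r, pigment.r, density),
                 lerpf(substrate.g, pigment.g, density),
                 lerpf(substrate.b, pigment.b, density) };
    }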
123. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Substrate-based effects
Cross-stylization
• Substrate distortion
• Distortion of the subject by the substrate
• U- and V-inclination
• U and V offset in patterns (generalized from substrate inclination)
127. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Edge-based effects
Cross-stylization
• Edge intensity
• Strength/intensity/darkness of an edge
• Edge width
• Thickness of an edge
• Edge transition
• Softness of the transition at an edge
131. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Abstraction-based effects
Cross-stylization
• Detail
• Amount of detail retained in the subject
• Shape
• Abstraction/distortion of shapes
• Blending
• Local blending of colors
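Taken together, the semantics above form a small, style-agnostic vocabulary. Continuing the earlier sketch, a hypothetical C++ table could tie each semantic to a category map and channel; the channel assignments here are assumptions for illustration, not MNPR's actual layout:

    // Sketch: control semantics mapped to (category map, channel).
    enum class EffectCategory { Pigment, Substrate, Edge, Abstraction };  // as before

    struct ControlSemantic {
        const char* name;         // semantic, as named in the slides above
        EffectCategory category;  // which stylization map it lives in
        int channel;              // 0 = R, 1 = G, 2 = B, 3 = A (assumed)
    };

    static const ControlSemantic kControlSemantics[] = {
        {"pigment-variation",    EffectCategory::Pigment,     0},
        {"pigment-application",  EffectCategory::Pigment,     1},
        {"pigment-density",      EffectCategory::Pigment,     2},
        {"substrate-distortion", EffectCategory::Substrate,   0},
        {"u-inclination",        EffectCategory::Substrate,   1},
        {"v-inclination",        EffectCategory::Substrate,   2},
        {"edge-intensity",       EffectCategory::Edge,        0},
        {"edge-width",           EffectCategory::Edge,        1},
        {"edge-transition",      EffectCategory::Edge,        2},
        {"detail",               EffectCategory::Abstraction, 0},
        {"shape",                EffectCategory::Abstraction, 1},
        {"blending",             EffectCategory::Abstraction, 2},
    };

Each style pipeline interprets the same channels in its own way (blending, for instance, may read as color bleeding in watercolor but as smudging in charcoal), so a scene art-directed in one style remains meaningful in the others.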
133. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Cross-stylization
Viewport 2.0 · Watercolor · Oil paint · Charcoal
3D model by Julien Kaspar
134. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Live demo, cross-stylization
Real-time showcase
135. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Cross-stylization 3D model by Stevie Brown
141. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Cross-stylization incompatibilities
Conclusion
Oil paint · Charcoal · Charcoal with material adjustment
3D model by Black Spire Studio
142. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Implementation
In a nutshell
149. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Creating a stylization pipeline
Implementation
• Define and create attributes for global effect parameters (C++)
• Outline the custom stylization pipeline (C++)
• Define NoiseFX and PaintFX controls (Python)
• Follow the control semantics and scheme
Refer to the paper and the source code for more implementation details.
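As a concrete, heavily simplified sketch of the first step: a global effect parameter becomes an animatable attribute on a configuration node through the standard Maya C++ API. The node and attribute names below are hypothetical; refer to the MNPR source for the actual ones:

    // Sketch: a global effect parameter as an animatable Maya attribute.
    // Node and attribute names are hypothetical.
    #include <maya/MPxNode.h>
    #include <maya/MFnNumericAttribute.h>
    #include <maya/MFnNumericData.h>

    class StylizationConfigNode : public MPxNode {  // hypothetical config node
    public:
        static MObject aEdgeIntensity;  // illustrative global effect parameter

        static MStatus initialize() {
            MFnNumericAttribute nAttr;
            aEdgeIntensity = nAttr.create("edgeIntensity", "eInt",
                                          MFnNumericData::kFloat, 1.0);
            nAttr.setMin(0.0);
            nAttr.setMax(10.0);
            nAttr.setKeyable(true);  // keyable, so it can be animated/art-directed
            return addAttribute(aEdgeIntensity);
        }
    };
    MObject StylizationConfigNode::aEdgeIntensity;

The remaining steps follow the same division of labor: the render-pass sequence is declared on the C++ side, while the Python side exposes the NoiseFX and PaintFX controls, following the control semantics above.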
150. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Conclusion
It’s time…
158. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Limitations and Future Work
• MNPR may not be production ready
• Motion coherence of substrate-based effects
• Maya limitations
• Hardware limitations
• Art-directed tools can be further explored/improved (proxies)
• Stylization control semantics can be further refined
• Potential cross-stylization incompatibilities
Conclusion
165. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
Conclusion: MNPR
• Expressive Non-Photorealistic Rendering Framework
• Covering the interaction spectrum
• Useful tools for artists at each level of control
• Stylization control semantics for predictable cross-stylization
with watercolor, oil paint and charcoal styles
• Using Autodesk Maya as a development framework
• Open-sourcing the framework to facilitate further
development and use by artists/engineers/researchers.
Conclusion
166. MNPR: A Framework for Real-Time Expressive Non-Photorealistic Rendering of 3D Computer Graphics
made by Brian Horgan
https://vimeo.com/285085957