No fear refactoring in the DVCS age: Enter SemanticMerge.
These are the slides of my talk at QCon 2013.
http://www.youtube.com/watch?v=GJuHtNZaong&feature=share&list=UUT5MZm9TcduZFstmQp9vRXQ&index=3
The document provides steps for getting started contributing to the Mozilla Developer Network (MDN) by creating an account, choosing a task like editing an existing article, completing the task without worrying about perfection, and joining the MDN community to ask questions. MDN's mission is to be a learning platform and provide complete documentation for open web technologies, whether supported by Mozilla or not.
This document is a presentation about migrating from Subversion to Git version control systems. It discusses Subversion and its limitations, introduces the SVK system as an intermediate step, and focuses on Git as a more advanced replacement. The presentation covers key features of Git like branching, merging, and how git-svn can be used to integrate a Git repository with a Subversion server. It encourages migrating to Git for improved distributed version control capabilities.
These are the slides used for the ReSharper + SemanticMerge webinar run with JetBrains on June 17th 2014.
http://info.jetbrains.com/Webinar_ReSharper-SemanticMerge-Registration.html
This document discusses SemanticMerge, a 3-way merge tool that handles merging at the structure level rather than textblock level. It parses code into intermediate trees to calculate diffs and find conflicts between code revisions. This enables extreme refactoring by simplifying code merges. SemanticMerge helps teams simplify today's merges and enables new development scenarios like cleaning up code through moving methods and splitting classes while easily merging changes.
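The structure-level approach described above can be illustrated, very loosely, with Python's built-in `ast` module. This is a toy sketch, not SemanticMerge's actual algorithm: it compares two revisions by parsed structure (which functions exist) rather than by text lines, so moving a function around in the file produces no spurious difference.

```python
import ast

def top_level_functions(source):
    """Return the names of top-level function definitions in a source string."""
    tree = ast.parse(source)
    return {node.name for node in tree.body if isinstance(node, ast.FunctionDef)}

def structural_diff(base, theirs):
    """Compare two revisions at the structure level: which functions were
    added or removed, regardless of where they appear in the file."""
    base_funcs = top_level_functions(base)
    their_funcs = top_level_functions(theirs)
    return {
        "added": their_funcs - base_funcs,
        "removed": base_funcs - their_funcs,
    }

base = "def load():\n    pass\n\ndef save():\n    pass\n"
theirs = "def save():\n    pass\n\ndef load_all():\n    pass\n"
print(structural_diff(base, theirs))
# {'added': {'load_all'}, 'removed': {'load'}}
```

Note that `save` moving to the top of the file is invisible to this diff, which is exactly the property a semantic merge tool exploits.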
Code quality is important to ensure code is easy to understand, maintain and extend. Key aspects of code quality include high test coverage, adherence to design principles like SOLID, metrics such as the maintainability index, and technical practices such as pair programming, code reviews, refactoring and test-driven development. Tools like Visual Studio and ReSharper can help analyze code quality metrics and identify areas for improvement.
Writing code that works and writing code that other people can read and understand are two different skills. And writing code that other people can read and understand becomes more and more essential as a project grows larger and more people start working on it.
But because it is a skill, you need to train it consciously. It's a lot like writing essays and books. Everybody can write letters and words; many can also connect them into grammatically correct sentences. But not everybody is a J. R. R. Tolkien whose books are read by everyone.
An essential part of learning this skill is reading and analyzing other people's code on the one hand, and making other people read your code and give you feedback about it on the other.
The speaker will talk about different methods of making programmers better writers and of developing the skill of writing code that other people will want to read.
Does Git make you angry inside? In this workshop you will get a gentle introduction to working efficiently as a Web developer in small teams, or as a solo developer. We'll focus on real-world examples you can actually use to make your work faster and more efficient. Windows? OS X? Linux? No problem, we'll get you up and running with Git, no matter what your system. Yes, this is an introductory session. This is for people who feel shame that they don't know how to "clone my GitHub project", wish they too could "get the gist", and get mad when people say "just diff me a patch" as if it's something as easy as making a mai tai even though you have no rum. No, you don't have to have Git installed to attend. You don't even need to know where the command line is on your computer.
This document summarizes a presentation by Dr. S. Ducasse on dedicated tools and research for software business intelligence at Tisoca 2014. It discusses:
- The need for dedicated tools tailored to specific problems to aid in maintenance, decision making, and reducing costs.
- The Moose technology for building custom analysis tools through its language-independent meta-model and ability to import different data sources.
- Examples of how analysis tools built with Moose have helped companies with challenges like migration, reverse engineering, and decision support.
- The benefits of an inventive toolkit approach that allows building multi-level dashboards, code analyzers, impact analyzers, and other custom tools to address specific problems.
1. Code refactoring involves changing the internal structure of code to improve its understandability and maintainability without changing its external behavior.
2. Refactoring techniques include extracting methods, inline methods, managing temporary variables, simplifying conditionals, and moving features between objects.
3. Refactoring should be done regularly in small steps to avoid bugs, improve readability and design, and facilitate future changes, but it is important to avoid over-refactoring or refactoring close to deadlines.
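As a minimal illustration of the first technique listed above, here is an "Extract Method" refactoring sketched in Python. The function names and data are invented for the example; the point is that behavior is preserved while the calculation gains a name of its own.

```python
# Before: one function mixes calculation and formatting.
def print_invoice_v1(items):
    total = 0
    for name, price, qty in items:
        total += price * qty
    print(f"Total due: {total:.2f}")

# After "Extract Method": the calculation is a separate, testable,
# reusable function, and external behavior is unchanged.
def calculate_total(items):
    return sum(price * qty for _, price, qty in items)

def print_invoice(items):
    print(f"Total due: {calculate_total(items):.2f}")

items = [("widget", 9.99, 2), ("gadget", 24.50, 1)]
print_invoice(items)  # Total due: 44.48
```

Done in small steps like this, with tests run after each step, the risk of introducing bugs stays low.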
Software Modeling and Artificial Intelligence: friends or foes? - Jordi Cabot
(1) Modeling and AI can be both friends and foes, depending on how they are used together.
(2) Model-driven engineering (MDE) approaches can help make AI systems like chatbots and machine learning pipelines more rigorous, robust, and interoperable by applying modeling principles.
(3) AI techniques like machine learning and deep learning also have the potential to enhance MDE, for example by enabling automated model transformations and smarter modeling tools with features like autocomplete.
The Modlet Pattern proposes organizing front-end code into self-contained modules called "modlets" that each contain related source code, templates, styles, documentation, tests, and demos. This structure enforces good design principles like separation of concerns and loose coupling. It also provides benefits like simplifying mental models, easing code maintenance and collaboration, and streamlining development processes. The document discusses challenges of file organization and proposes solutions using tools like StealJS, DocumentJS, and Testee. It suggests publishing modlets as reusable npm packages.
For more information, visit: http://www.godatadriven.com/accelerator.html
Data scientists aren’t a nice-to-have anymore, they are a must-have. Businesses of all sizes are scooping up this new breed of engineering professional. But how do you find the right one for your business?
The Data Science Accelerator Program is a one-year program, delivered in Amsterdam by world-class industry practitioners. It provides your aspiring data scientists with intensive on- and off-site instruction, access to an extensive network of speakers and mentors, and coaching.
The Data Science Accelerator Program helps you assess and radically develop the skills of your data science staff or recruits.
Our goal is to deliver you excellent data scientists that help you become a data driven enterprise.
The right tools
We teach your organisation the proven data science tools.
The right hands
We are trusted by many industry leading partners.
The right experience
We've done big data and data science at many clients, we know what the real world is like.
The right experts
We have a world class selection of lecturers that you will be working with.
Vincent D. Warmerdam
Jonathan Samoocha
Ivo Everts
Rogier van der Geer
Ron van Weverwijk
Giovanni Lanzani
The right curriculum
We meet twice a month. Once for a lecture, once for a hackathon.
Lectures
The RStudio stack.
The art of simulation.
The IPython stack.
Linear modelling.
Operations research.
Nonlinear modelling.
Clustering & ensemble methods.
Natural language processing.
Time series.
Visualisation.
Scaling to big data.
Advanced topics.
Hackathons
Scrape and mine the internet.
Solving multi-armed bandit problems.
Web dev with Flask and pandas as a backend.
Build an automation script for linear models.
Build a heuristic TSP solver.
Code review your automation for nonlinear models.
Build a method that outperforms random forests.
Build a Markov chain to generate song lyrics.
Predict an optimal portfolio for the stock market.
Create an interactive D3 app with a backend.
Start up a Spark cluster with large S3 data.
You pick!
Interested?
Ping us here: signal@godatadriven.com
The document discusses various aspects of refactoring code including underengineering, overengineering, code smells, Cunningham's metaphor of design debt, when to refactor, how to refactor using small behavior-preserving transformations and test-driven development, the relationship between patterns and refactorings, and important readings on refactoring. The overall message is that refactoring helps improve code quality by removing duplication, simplifying code, and clarifying intent through a process of continuous small changes while preserving behavior.
Improving the accuracy and reliability of data analysis code - Johan Carlin
1) The document discusses improving the accuracy and reliability of data analysis code through testing and version control. It emphasizes that reliable code is well-documented, generalizable beyond specific datasets, and includes tests to verify functionality.
2) Common approaches to testing include null simulations to calculate error rates and parameter recovery tests to confirm models can learn known weights. Version control through Git or SVN provides a record of code states over time.
3) The document argues that testing makes sense for scientific computing given demands on accuracy and risks of errors influencing research. Tests can target hypotheses, analysis methods, and experiment scripts.
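A parameter recovery test of the kind mentioned above can be sketched in a few lines of Python. This is a toy model (a least-squares slope through the origin) with invented names, not code from the document: we simulate data with a known weight, then check that the fitting code recovers it.

```python
import random

def fit_slope(xs, ys):
    """Ordinary least squares slope through the origin: sum(x*y) / sum(x*x)."""
    return sum(x * y for x, y in zip(xs, ys)) / sum(x * x for x in xs)

def parameter_recovery_test(true_slope=2.5, n=1000, noise=0.1, seed=42):
    """Simulate data with a known weight, then confirm the model recovers it.
    If the fitted slope is far from the true one, the analysis code is suspect."""
    rng = random.Random(seed)
    xs = [rng.uniform(-1, 1) for _ in range(n)]
    ys = [true_slope * x + rng.gauss(0, noise) for x in xs]
    estimate = fit_slope(xs, ys)
    assert abs(estimate - true_slope) < 0.05, f"recovery failed: {estimate}"
    return estimate

parameter_recovery_test()
```

A null simulation works the same way in reverse: feed the analysis pure noise and confirm that the "effect" it reports stays within the expected false-positive rate.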
This document contains frequently asked questions (FAQs) about big data technologies like Hadoop, MongoDB, and related topics. Key topics covered include using Hadoop for processing large datasets, MongoDB features and administration, optimizing web crawlers, performing clustering on large datasets, and comparing algorithms like logistic regression, decision trees, and neural networks. Configuration parameters for Hadoop like dfs.name.dir and dfs.data.dir are also discussed.
Moving to Microservices with the Help of Distributed Traces - KP Kaiser
Moving away from a monolith to a microservices architecture is a process fraught with hidden challenges. Legacy code, infrastructure, and organizational processes all need to change in order to make the switch successful.
But microservices come with a huge increase in infrastructure complexity. We'll see how distributed traces empower developers to work with greater autonomy, in increasingly complex deployment environments.
How to do code review and use analysis tools in software development - Mitosis Technology
Code inspection is a phase of the software development process that finds and corrects errors in both functional and non-functional areas at an early stage.
This document provides an overview of VBA skills at level 3, including working with variables, loops, arrays, conditional statements, range and dictionary objects, subroutines, functions, and userforms. It also discusses using ADO to connect VBA to backend databases, reading and writing CSV files without Excel using FreeFile, and introducing classes in VBA for object-oriented programming. The document concludes with examples of how VBA has been used to automate risk analysis and reporting processes by cutting out manual labor and streamlining multiple people's work into single-click solutions.
Based on my observations, in IT we suffer from continuous collective amnesia and we are even proud of it.
For at least 50 years now, we have struggled with how to build systems that are easy to understand, to maintain, to change and to operate in a reliable way. Each time we hit the wall again, we start to look for a new silver bullet on the horizon, strongly believing that it will solve the problem for good.
The key word is "new": "New" is good in our community, while "old" is bad, worthless, crap. We suffer from youthism, not only in recruiting, but in all areas. This way we discard any "old" knowledge, no matter if it is valuable or not. We separate by age, not by value.
Additionally, we continuously lose our collective memory with every new generation that leaves university, as graduates too are taught not to value anything old and to look only for the new, shiny stuff.
While not all old knowledge is worth being preserved, admittedly, there is still a lot of valuable old knowledge available, offering answers to the problems that we face today - creating maintainable and reliable systems, dealing with distribution and tackling complexity, just to name a few of the challenges.
This presentation is a journey through some (very) old computer science papers that contain a lot of very valuable knowledge regarding the problems we face today. For each of the papers, some of the key ideas are presented, along with how they address our current challenges.
Of course, the voice track is missing and there are a lot more papers that would be worth being mentioned in this presentation. Still, I hope that also the slides alone will be of some value for you - and convince you a bit that not everything "old" in IT is automatically worthless ... ;)
The document discusses several key principles of software engineering:
1. Modularity - Systems should be composed of independent modules that can be developed and reused independently.
2. Abstraction - Complexity is managed by abstracting away unnecessary details and focusing on essential aspects.
3. Separation of concerns - Different aspects of a problem are separated, so each can be addressed independently.
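A tiny Python sketch, with invented names, of how these three principles can show up together in code: the storage unit is a reusable module, callers see only its abstract interface, and formatting is kept apart from persistence.

```python
class TaskStore:
    """Modularity: persistence is its own unit, reusable elsewhere.
    Abstraction: callers see add()/all(), not the list stored inside."""
    def __init__(self):
        self._tasks = []

    def add(self, title):
        self._tasks.append(title)

    def all(self):
        return list(self._tasks)

def render(tasks):
    """Separation of concerns: formatting knows nothing about storage."""
    return "\n".join(f"- {t}" for t in tasks)

store = TaskStore()
store.add("write tests")
store.add("refactor")
print(render(store.all()))
# - write tests
# - refactor
```

Because the concerns are separated, swapping the list for a database, or the text renderer for HTML, touches only one unit.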
2013 Twin Cities Drupal Camp - No CSS Needed: A Sitebuilders' Guide to Theming - Tara King
This document discusses how modules can help with theming in Drupal without needing to learn CSS. It recommends several modules that can assist with common theming tasks like color, typography, layout and images. Color and Font Your Face allow changing theme colors and fonts easily. Display Suite provides a drag and drop interface for layouts. Gallery Formatter turns image fields into galleries. While theming directly in code is difficult, modules get users 90% of the way to a theme. The document encourages learning from the large Drupal community for help with remaining challenges.
Workshop - The Little Pattern That Could.pdf - Tobias Goeschel
The document discusses refactoring a monolithic application to follow Domain-Driven Design (DDD) and microservice principles. It provides exercises and hints to guide refactoring the codebase to use Hexagonal Architecture with separated domains, commands and queries using CQRS, and persistence-oriented repositories. Later exercises discuss improving test speed by isolating dependencies and refactoring for a serverless architecture by splitting the application into individual use cases and replacing the in-memory repository.
Modular plugins are plugins that can be modified and extended without changing core plugin code. They utilize WordPress methods like add_action(), do_action(), add_filter(), and apply_filters() to make their components extensible by themes or other plugins. While most plugins are not built modularly, making a plugin modular allows other developers to customize and build upon it, increasing its usefulness and creating happier users. Modular design follows WordPress's own extensible architecture.
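The hook mechanism can be sketched in Python as a stand-in for WordPress's PHP functions. The `add_filter` and `apply_filters` functions below mimic the WordPress APIs of the same name, but the implementation is an illustrative toy, not WordPress code: the core plugin declares an extension point, and other code registers callbacks against it without touching core.

```python
# Registry of callbacks keyed by hook name.
_filters = {}

def add_filter(hook, callback):
    """Register a callback to run whenever `hook` is applied."""
    _filters.setdefault(hook, []).append(callback)

def apply_filters(hook, value):
    """Pass `value` through every callback registered for `hook`."""
    for callback in _filters.get(hook, []):
        value = callback(value)
    return value

# Core plugin code: pipes its output through an extension point.
def get_title():
    return apply_filters("the_title", "hello world")

# A theme or second plugin customizes behavior without changing core code.
add_filter("the_title", str.title)
print(get_title())  # Hello World
```

This is the extensibility pattern the abstract describes: the plugin stays modular because new behavior hooks in from outside rather than being patched in.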
How I Learned to Stop Worrying and Love Legacy Code - Ox:Agile 2018 - Mike Harris
I never wrote it; everybody else did! How many times have you waded through an ageing, decaying, tangled forest of code and wished it would just die? How many times have you heard someone say that what really needs to happen is a complete rewrite? I have heard this many times, and have uttered that fatal sentence myself. But shouldn't we love our legacy code? Doesn't it represent our investment and the hard work of ourselves and our predecessors? Throwing it away is dangerous, because, before we do, we'll need to work out exactly what it does, and we'll need to tweeze out that critical business logic nestled in a deeply entangled knot of IF statements. It could take us years to do, and we'll have to maintain two systems whilst we do it, inevitably adding new features to them both. Yes, we get to reimplement using the latest, coolest programming language, instead of an old behemoth, but how long will our new cool language be around, and who will maintain that code when it itself inevitably turns to legacy? We can throw our arms in the air, complaining and grumbling about how we didn't write the code, how we would never have written it the way it is, how those that wrote it were lesser programmers, possibly lesser humans themselves, but the code still remains, staring us in the face and hanging around for longer than we could possibly imagine. We can sort it out, we can improve it, we can make it testable, and we can learn to love our legacy code.
https://www.youtube.com/watch?v=qRP45l5UugE
This document introduces design patterns and provides an overview of the Singleton pattern. It discusses different ways to implement the Singleton pattern, including eager initialization, static block initialization, lazy initialization, thread-safe implementations using synchronization and double-checked locking, and Bill Pugh's inner static class approach. It also notes that reflection can be used to circumvent Singleton implementations and destroy the singleton nature of the class. The document categorizes design patterns and provides examples of creational, structural, and behavioral patterns.
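A Python analog of the thread-safe, double-checked-locking variant mentioned above (the class name `Config` is invented for the example; the document discusses the Java idioms, where Bill Pugh's holder class is usually the preferred alternative):

```python
import threading

class Config:
    """A lazily initialized, thread-safe singleton using double-checked
    locking: the lock is taken only on the rare first-creation path."""
    _instance = None
    _lock = threading.Lock()

    def __new__(cls):
        if cls._instance is None:            # first check, without the lock
            with cls._lock:
                if cls._instance is None:    # second check, under the lock
                    cls._instance = super().__new__(cls)
        return cls._instance

a = Config()
b = Config()
print(a is b)  # True: both names refer to the single instance
```

As the document notes for Java, such implementations are not bulletproof: reflection (or, in Python, simply resetting `Config._instance`) can defeat the singleton property.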
You are a clever and talented person. You create beautiful designs, or perhaps you can architect a system that even a cat could use. Your peers adore you. Your clients love you. But (until now) you haven't *&^#^ been able to make Git bend to your will. It makes you angry inside that you have to ask your co-worker, again, for that *&^#^ command to share your work.
It's not you. It's Git. Promise.
We'll kick off this session with an explanation of why Git is so freaking hard to learn. Then we'll flip the tables and make YOU (not Git) the centre of attention. You'll learn how to define, and sketch out how version control works, using terms and scenarios that make sense to you. Yup, sketch. On paper. (Tablets and other electronic devices will be allowed, as long as you promise not to get distracted choosing the perfect shade for rage.) To this diagram you'll layer on the common Git commands that are used regularly by efficient Git-using teams. It'll be the ultimate cheat sheet, and specific to your job. If you think this sounds complicated, it's not! Your fearless leader, Emma Jane, has been successfully teaching people how-to-tech for over a decade. She is well known for her non-technical metaphors which ease learners into complex, work-related topics that previously felt inaccessible.
Yes, this is an introductory session. No, you don't have to have Git installed to attend. You don't even need to know where the command line is on your computer. Yes, you should attend if you've been embarrassed to ask team-mates what Git command you used three weeks ago to upload your work...just in case you're supposed to remember.
If you're a super-human Git fanatic who is frustrated by people who don't just "git it", this session is also for you. You'll learn new ways to effectively communicate your ever-loving Git, and you may develop a deeper understanding of why your previous attempts to explain Git have failed.
Skybuffer SAM4U tool for SAP license adoption - Tatiana Kojar
Manage and optimize your license adoption and consumption with SAM4U, a free SAP software asset management tool for customers.
SAM4U, an SAP complimentary software asset management tool for customers, delivers a detailed and well-structured overview of license inventory and usage with a user-friendly interface. We offer a hosted, cost-effective, and performance-optimized SAM4U setup in the Skybuffer Cloud environment. You retain ownership of the system and data, while we manage the ABAP 7.58 infrastructure, ensuring fixed Total Cost of Ownership (TCO) and exceptional services through the SAP Fiori interface.
Digital Marketing Trends in 2024 | Guide for Staying Ahead - Wask
https://www.wask.co/ebooks/digital-marketing-trends-in-2024
Feeling lost in the digital marketing whirlwind of 2024? Technology is changing, consumer habits are evolving, and staying ahead of the curve feels like a never-ending pursuit. This e-book is your compass. Dive into actionable insights to handle the complexities of modern marketing. From hyper-personalization to the power of user-generated content, learn how to build long-term relationships with your audience and unlock the secrets to success in the ever-shifting digital landscape.
3. Separation of concerns - Different aspects of a problem are separated, so each can be addressed independently.
2013 Twin Cities Drupal Camp - No CSS Needed: A Sitebuilders' Guide to ThemingTara King
This document discusses how modules can help with theming in Drupal without needing to learn CSS. It recommends several modules that can assist with common theming tasks like color, typography, layout and images. Color and Font Your Face allow changing theme colors and fonts easily. Display Suite provides a drag and drop interface for layouts. Gallery Formatter turns image fields into galleries. While theming directly in code is difficult, modules get users 90% of the way to a theme. The document encourages learning from the large Drupal community for help with remaining challenges.
Workshop - The Little Pattern That Could.pdfTobiasGoeschel
The document discusses refactoring a monolithic application to follow Domain-Driven Design (DDD) and microservice principles. It provides exercises and hints to guide refactoring the codebase to use Hexagonal Architecture with separated domains, commands and queries using CQRS, and persistence-oriented repositories. Later exercises discuss improving test speed by isolating dependencies and refactoring for a serverless architecture by splitting the application into individual use cases and replacing the in-memory repository.
Modular plugins are plugins that can be modified and extended without changing core plugin code. They utilize WordPress methods like add_action(), do_action(), add_filter(), and apply_filters() to make their components extensible by themes or other plugins. While most plugins are not built modularly, making a plugin modular allows other developers to customize and build upon it, increasing its usefulness and creating happier users. Modular design follows WordPress's own extensible architecture.
How I Learned to Stop Worrying and Love Legacy Code - Ox:Agile 2018Mike Harris
I never wrote it; everybody else did! How many times have you waded through an ageing, decaying, tangled forrest of code and wished it would just die? How many times have you heard someone say that what really needs to happen is a complete rewrite? I have heard this many times, and, have uttered that fatal sentence myself. But shouldn’t we love our legacy code? Doesn’t it represent our investment and the hard work of ourselves and our predecessors? Throwing it away is dangerous, because, before we do, we’ll need to work out exactly what it does, and we’ll need to tweeze out that critical business logic nestled in a deeply entangled knot of IF statements. It could take us years to do, and we’ll have to maintain two systems whilst we do it, inevitably adding new features to them both. Yes we get to reimplement using the latest, coolest programming language, instead of an old behemoth, but how long will our new cool language be around, and who will maintain that code, when it itself inevitably turns to legacy? We can throw our arms in the air, complaining and grumbling about how we didn’t write the code, how we would never have written it the way it is, how those that wrote it were lesser programmers, possibly lesser humans themselves, but the code still remains, staring us in the face and hanging around for longer that we could possibly imagine. We can sort it out, we can improve it, we can make it testable, and we can learn to love our legacy code.
https://www.youtube.com/watch?v=qRP45l5UugE
This document introduces design patterns and provides an overview of the Singleton pattern. It discusses different ways to implement the Singleton pattern, including eager initialization, static block initialization, lazy initialization, thread-safe implementations using synchronization and double-checked locking, and Bill Pugh's inner static class approach. It also notes that reflection can be used to circumvent Singleton implementations and destroy the singleton nature of the class. The document categorizes design patterns and provides examples of creational, structural, and behavioral patterns.
You are a clever and talented person. You create beautiful designs, or perhaps you can architect a system that even a cat could use. Your peers adore you. Your clients love you. But (until now) you haven't *&^#^ been able to make Git bend to your will. It makes you angry inside that you have to ask your co-worker, again, for that *&^#^ command to share your work.
It's not you. It's Git. Promise.
We'll kick off this session with an explanation of why Git is so freaking hard to learn. Then we'll flip the tables and make YOU (not Git) the centre of attention. You'll learn how to define, and sketch out how version control works, using terms and scenarios that make sense to you. Yup, sketch. On paper. (Tablets and other electronic devices will be allowed, as long as you promise not to get distracted choosing the perfect shade for rage.) To this diagram you'll layer on the common Git commands that are used regularly by efficient Git-using teams. It'll be the ultimate cheat sheet, and specific to your job. If you think this sounds complicated, it's not! Your fearless leader, Emma Jane, has been successfully teaching people how-to-tech for over a decade. She is well known for her non-technical metaphors which ease learners into complex, work-related topics that previously felt inaccessible.
Yes, this is an introductory session. No, you don't have to have Git installed to attend. You don't even need to know where the command line is on your computer. Yes, you should attend if you've been embarrassed to ask team-mates what Git command you used three weeks ago to upload your work...just in case you're supposed to remember.
If you're a super-human Git fanatic who is frustrated by people who don't just "git it", this session is also for you. You'll learn new ways to effectively communicate your ever-loving Git, and you may develop a deeper understanding of why your previous attempts to explain Git have failed.
2. No fear refactoring in the DVCS age
Enter SemanticMerge
Pablo Santos @psluaces
@semanticmerge
meet us at booth #14
3. Agenda
• How to diff and merge refactored code with SemanticMerge
• Under the hood: the inner workings of the tool
• Next step: multi-file SemanticMerge
• Towards Semantic version control
4. Tech from the ’80s
• Git and the DVCS pack came in 2005
• They went mainstream (GitHub)
• They all can do awesome merges
• But they rely on old-fashioned diff and merge tools
6. You can do amazing things with DVCS
• It has greatly improved merge tracking
• It does a great job finding the contributors for the 3-way merge
• But at the end of the day… it invokes an external 3-way merge tool to solve the merge
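For example, git hands conflicted files to an external tool through `git mergetool`. A sketch of wiring up SemanticMerge this way is shown below; the `$BASE`/`$LOCAL`/`$REMOTE`/`$MERGED` variables are expanded by git itself, but the executable name and the `-s/-d/-b/-r` flags are assumptions here — check your installation's documentation:

```ini
# ~/.gitconfig – route 3-way merges to an external semantic merge tool.
# The command name and its flags below are illustrative assumptions.
[merge]
    tool = semanticmerge
[mergetool "semanticmerge"]
    cmd = semanticmergetool -s \"$REMOTE\" -d \"$LOCAL\" -b \"$BASE\" -r \"$MERGED\"
    trustExitCode = true
```

With this in place, running `git mergetool` after a conflicted merge opens each conflicted file in the configured tool instead of the default text-based one.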
10. What will a conventional 3-way merge tool do?
• No conflict!
• And two methods!!
11. What will a conventional 3-way merge tool do?
• No conflict!
• And two methods!!
• It simply finds two blocks of text being added and the original block being deleted… It doesn’t care about the code structure
13. What is semanticmerge?
I guess you all know by now ;-)
• It is a 3-way merge tool (handles src, dst and base).
• It is refactor-aware and programming language-aware.
• Handles merging at the structure level, not the text-block level.
• It means:
• It first parses the code – creates intermediate trees.
• Then calculates diff pairs: base-src, base-dst.
• Then looks for conflicts between the pairs.
• Enabling eXtreme Refactoring was always the goal :-)
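The parse → diff-pairs → conflict-detection pipeline above can be sketched in miniature. This is a deliberately simplified, hypothetical model — each revision is reduced to a map of declaration names to bodies — not SemanticMerge's actual internals:

```python
# Hedged sketch of a structure-level 3-way merge: diff base->src and
# base->dst per declaration, then cross-check the two change sets.

def diff(base, other):
    """Return {name: ('added'|'modified'|'deleted', body)} changes."""
    changes = {}
    for name, body in other.items():
        if name not in base:
            changes[name] = ('added', body)
        elif base[name] != body:
            changes[name] = ('modified', body)
    for name in base:
        if name not in other:
            changes[name] = ('deleted', None)
    return changes

def merge3(base, src, dst):
    """Apply non-overlapping changes; the same declaration touched on
    both sides is flagged as a conflict for manual resolution."""
    src_changes = diff(base, src)
    dst_changes = diff(base, dst)
    conflicts = sorted(set(src_changes) & set(dst_changes))
    merged = dict(base)
    for changes in (src_changes, dst_changes):
        for name, (kind, body) in changes.items():
            if name in conflicts:
                continue  # left for manual resolution
            if kind == 'deleted':
                merged.pop(name, None)
            else:
                merged[name] = body
    return merged, conflicts
```

Note the key difference from a text merge: changes are matched by declaration, so two edits to different methods never collide, while two edits to the same method always surface as a conflict regardless of which lines they touch.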
14. How can semanticmerge affect development?
We were always motivated by “the cost of change” and by how refactoring can greatly help keep code quality high and reduce maintenance costs.
SemanticMerge is all about helping teams clean up the code and keep it readable and simple, without restrictions: do it in parallel, fine!
15. How can semanticmerge affect development?
1) It helps simplify merges TODAY.
2) But more importantly, it enables new scenarios that you’re not doing today: clean up the code, move methods, split classes… and just be able to merge it back.
18. Detect conflicts that regular tools can’t
• What if the same method is modified concurrently at different lines?
• SemanticMerge detects this case and can force the conflict resolution to be manual – a regular text-based merge tool can’t do that, because it doesn’t have the context
20. Creating a tree-like view of the code
using System [1]
using System.Text [2]
namespace Sample [4-26]
class Math [6-25]
int Add(int a, int b) [8-12]
int Mul(int a, int b) [14-18]
int Subst(int a, int b) [20-24]
22. Some complex cases – cyclic move
base
namespace Test
class Socket
class Utils
class DNS
source
namespace Test
class DNS
class Socket
class Utils
destination
namespace Test
class Socket
class Utils
class DNS
23. Evil twin
base
namespace Test
class Socket
method Connect
source
namespace Test
class Socket
method Connect
method Send
destination
namespace Test
class Socket
method Connect
method Send
Two methods with exactly the same signature can’t be added at the same location -> conflict
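Detecting the evil-twin case can be sketched as a check over declarations added on both sides of the same container — again using the simplified, hypothetical name→body model rather than SemanticMerge's real representation:

```python
def classify_added_twins(base, src, dst):
    """Declarations added on both sides under the same container:
    identical bodies can be merged automatically, while differing
    bodies are 'evil twins' that require manual resolution."""
    added_src = {k: v for k, v in src.items() if k not in base}
    added_dst = {k: v for k, v in dst.items() if k not in base}
    both = set(added_src) & set(added_dst)
    auto = sorted(k for k in both if added_src[k] == added_dst[k])
    evil_twins = sorted(k for k in both if added_src[k] != added_dst[k])
    return auto, evil_twins
```

In the slide's scenario, both branches add a `Send` method to `class Socket`; if the two bodies differ, the pair lands in the `evil_twins` list and the merge stops for a manual decision.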
25. Next steps
Check UserVoice -
http://plasticscm.uservoice.com
• JavaScript is the top request
• C/C++ - Objective-C
• XML
• Ruby, Scala…
• Mac OS X support
• External Parsers (Delphi already there)
26. Next steps
• Semantic code review – each time you review code you would like to go straight to the point…
• Semantic blame/annotate – calculate the blame considering methods
• Semantic Method History
• Semantic repository stats