In this article, I'm going to discuss a problem few people think about. Computer simulation of various processes is becoming more and more widespread. This technology is wonderful because it allows us to save time and materials that would otherwise be spent on chemical, biological, physical, and other kinds of experiments. A computer simulation of the flow around a wing section may significantly reduce the number of prototypes to be tested in a real wind tunnel. Numerical experiments are trusted more and more nowadays. However, dazzled by the triumph of computer simulation, nobody notices the problem of software complexity growth behind it. People treat computers and computer programs just as a means to obtain the necessary results. I'm worried that very few know and care about the fact that growth in software size leads to a non-linear growth in the number of software bugs. It's dangerous to treat a computer as just a big calculator. So here's what I think: I need to share this idea with other people.
I recently read a post about a check of the LibRaw project performed with Coverity Scan. It said that nothing interesting had been found, so I decided to try our PVS-Studio analyzer on it.
KDE (short for K Desktop Environment) is a desktop environment primarily for Linux and other UNIX-like operating systems. To put it simply, it's the thing responsible for the entire graphical design. The environment is based on Qt, the cross-platform user interface development toolkit. The development is done by several hundred programmers throughout the world devoted to the idea of free software. KDE offers a complete set of user environment applications that allows one to interact with the operating system within the framework of a modern graphical interface. So let's see what KDE has under the hood.
Checking the Cross-Platform Framework Cocos2d-x - Andrey Karpov
Cocos2d is an open source software framework. It can be used to build games, apps, and other cross-platform, GUI-based interactive programs. Cocos2d has many branches, the best known being Cocos2d-Swift, Cocos2d-x, Cocos2d-html5, and Cocos2d-XNA.
In this article, we are going to discuss the results of checking Cocos2d-x, the C++ branch of the framework, with PVS-Studio 5.18. The project is pretty high-quality, but there are still some issues to consider. The source code was downloaded from GitHub.
I know I promised not to touch upon the topic of 3DO console emulators anymore - well, sorry for breaking that promise. You see, I've recently had an opportunity to try such an exotic thing as a static code analyzer - PVS-Studio, to be exact. The first project I decided to try it on was, naturally, my 3DO console emulator (the Phoenix Project). The 3DO was the first 32-bit console with a CD drive, dating back to the early 1990s. Dad bought it in Moscow as a present for me and my brother, and I've been fond of it ever since :-). And since I've got such an opportunity, why not check all the other 3DO emulators too? Here we go...
In this article I'm going to show you some examples explaining why physicists developing software products to be used in their field should also use static code analysis tools. I would be glad to see PVS-Studio in this role, but any other analyzer would do as well, of course. A code analyzer can significantly reduce the debugging time and headache from silly mistakes. Isn't it better when you can focus on physics rather than waste time seeking and fixing bugs in C++ applications?
Monitoring a program that monitors computer networks - Andrey Karpov
There is the NetXMS project, a software product designed to monitor computer systems and networks. It can be used to monitor the whole IT infrastructure, from SNMP-compatible devices to server software. And I am naturally going to monitor the code of this project with the PVS-Studio analyzer.
The Chromium browser is developing very fast. When we checked the solution for the first time in 2011, it included 473 projects. Now it includes 1169 projects. We were curious to know if Google developers had managed to keep the highest quality of their code with Chromium developing at such a fast rate. Well, they had.
64-bit computers have been around for a long time already. Most applications have 64-bit versions that can benefit from larger memory capacity and improved performance thanks to the architectural capabilities of 64-bit processors. Developing 64-bit applications in C/C++ requires much attention from the programmer. There are a number of reasons why 32-bit code fails to work properly when recompiled for a 64-bit platform. There are a lot of articles on this subject, so we will focus on something else. Let's find out whether the new features introduced in C++11 have made 64-bit software programmers' lives any better and easier.
Note. The article was originally published in Software Developer's Journal (April 25, 2014) and is published here by the editors' permission.
We thought of checking the Boost library long ago but were not sure we would collect enough results for an article. However, the wish remained. We tried twice but gave up each time because we didn't know how to replace a compiler call with a PVS-Studio.exe call. Now we have new tools at our disposal, and the third attempt has been successful. So, are there any bugs to be found in Boost?
Firefox Easily Analyzed by PVS-Studio Standalone - Andrey Karpov
We already checked Mozilla Firefox with the PVS-Studio analyzer three years ago. It was pretty inconvenient and troublesome at the time. You see, there is no Visual Studio project file for Firefox: the build is done with makefiles. That's why you can't just take the project and check it. We had to integrate PVS-Studio into the build system, which proved to be a difficult task. If I remember rightly, it all resulted in successfully analyzing only a part of the project. But everything is different now that we have PVS-Studio Standalone. We can now monitor all compiler launches and easily check the project.
Just recently I've checked the VirtualDub project with PVS-Studio. This was a random choice. You see, I believe that it is very important to regularly check and re-check various projects to show users that the PVS-Studio analyzer is evolving, and which project you run it on doesn't matter that much - bugs can be found everywhere. We already checked the VirtualDub project in 2011, but we found almost nothing of interest then. So, I decided to take a look at it now, 2 years later.
Any large modern application consists of numerous third-party libraries, and I'd like to discuss the topic of our trust in these libraries. In books and articles, there are lots of debates about code quality, testing methods, development methodologies, and so on. But I don't remember anyone discussing the quality of the bricks that applications are built from. So let's talk about it today. For example, there is the Medicine Insight Segmentation and Registration Toolkit (ITK). I find it to be implemented pretty well. At least, I have noticed just a few bugs in its code. But I cannot say the same about the code of the third-party libraries used there. So the question is: how much can we trust such systems? Much food for thought.
Good has won this time. To be more exact, the source code of the Chromium project has won. Chromium is one of the best projects we have checked with PVS-Studio.
How to make fewer errors at the stage of code writing. Part N1 - Andrey Karpov
I've got my hands on the source code of the widely known instant messenger Miranda IM. Together with various plugins, it is a rather large project of about 950 thousand lines of C and C++ code. And like any other sizable project with a long development history, it contains quite a few errors and typos.
Consequences of using the Copy-Paste method in C++ programming and how to dea... - Andrey Karpov
I develop the PVS-Studio analyzer, which detects errors in the source code of C/C++/C++0x software. So I have to review large amounts of source code from various applications in which PVS-Studio detected suspicious fragments. I have collected many examples demonstrating that an error occurred because a code fragment was copied and then modified. Of course, it has been known for a long time that using Copy-Paste in programming is a bad thing. But let's try to investigate this problem closely instead of limiting ourselves to just saying "do not copy the code".
In this article, you're going to find 60 terrible coding tips, along with explanations of why they are terrible. It's a fun and serious piece at the same time. No matter how terrible these tips look, they aren't fiction; they are real: we saw them all in the real programming world.
Errors that are hard to notice in code review but that are found by static... - Andrey Karpov
There are errors that easily hide from programmers during code reviews. Most often they are related to typos or to insufficient knowledge of the subtle nuances of a language or library. Let's look at some interesting examples of such errors and at how they can be detected with static analysis. Note that analyzers do not compete with code reviews or, say, unit tests; they are an excellent complement to other bug-fighting methodologies.
When should you start using PVS-Studio? What can PVS-Studio detect? Supported standards: MISRA, CWE, CERT, OWASP, AUTOSAR. What about analysis options? What about legacy code?
Double freeing of resources. Unreachable code. Incorrect shift operations. Incorrect handling of types. Typos and copy-paste. Security issues. Confusion with operator precedence.
Make Your and Other Programmer’s Life Easier with Static Analysis (Unreal Eng... - Andrey Karpov
What is static analysis and what is it for? How does static analysis work? (Unreal Engine 4). How to introduce static analysis in your project: best practices.
Does static analysis need machine learning? - Andrey Karpov
Introduction to static analysis. Existing solutions and the approaches they implement. Problems and pitfalls when creating an analyzer. Learning "manually". Learning on a real, large code base. The most promising approaches.
Typical errors in code on the example of C++, C#, and Java - Andrey Karpov
Objectives of this webinar
How we detected error patterns
Patterns themselves and how to avoid them:
3.1 Copy-paste and last line effect
3.2 if (A) {...} else if (A)
3.3 Errors in checks
3.4 Array index out of bounds
3.5 Operator precedence
3.6 Typos that are hard to spot
How to use static analysis properly
Conclusion
Q&A
How to Fix Hundreds of Bugs in Legacy Code and Not Die (Unreal Engine 4) - Andrey Karpov
How to fight bugs in legacy code?
Should you do it at all?
What to do if there are hundreds or even thousands of errors? (That's usually the case.)
How to avoid spending a plethora of man-hours on this?
And still, how did you work with Unreal Engine?
C++ Code as Seen by a Hypercritical Reviewer - Andrey Karpov
We all do code reviews, and whoever doesn't admit it probably does them twice as often. A C++ code reviewer is like a sapper, except that a reviewer can make a mistake more than once. Sometimes, though, the consequences are just as painful. A brave new code review world.
Static Code Analysis for Projects Built on Unreal Engine - Andrey Karpov
Why Do You Need Static Analysis? Detect errors early in the program development process. Get recommendations on code formatting. Check your spelling. Calculate various software metrics.
Are C and C++ Alive? Even More, IBM RPG Is! C and C++ Are Not Just for Old Systems. Are C and C++ Alive? Summary for C, C++. Embedded: C and C++ Are on the Rise.
Zero, one, two, Freddy's coming for you - Andrey Karpov
This post continues the series of articles, which can well be called "horrors for developers". This time it will also touch upon a typical pattern of typos related to the usage of numbers 0, 1, 2. The language you're writing in doesn't really matter: it can be C, C++, C#, or Java. If you're using constants 0, 1, 2 or variables' names contain these numbers, most likely, Freddy will come to visit you at night. Go on, read and don't say we didn't warn you.
Developing Distributed High-performance Computing Capabilities of an Open Sci... - Globus
COVID-19 had an unprecedented impact on scientific collaboration. The pandemic and its broad response from the scientific community has forged new relationships among public health practitioners, mathematical modelers, and scientific computing specialists, while revealing critical gaps in exploiting advanced computing systems to support urgent decision making. Informed by our team’s work in applying high-performance computing in support of public health decision makers during the COVID-19 pandemic, we present how Globus technologies are enabling the development of an open science platform for robust epidemic analysis, with the goal of collaborative, secure, distributed, on-demand, and fast time-to-solution analyses to support public health.
A Comprehensive Look at Generative AI in Retail App Testing - kalichargn70th171
Traditional software testing methods are being challenged in retail, where customer expectations and technological advancements continually shape the landscape. Enter generative AI—a transformative subset of artificial intelligence technologies poised to revolutionize software testing.
In software engineering, the right architecture is essential for robust, scalable platforms. Wix has undergone a pivotal shift from event sourcing to a CRUD-based model for its microservices. This talk will chart the course of this pivotal journey.
Event sourcing, which records state changes as immutable events, provided robust auditing and "time travel" debugging for Wix Stores' microservices. Despite its benefits, the complexity it introduced in state management slowed development. Wix responded by adopting a simpler, unified CRUD model. This talk will explore the challenges of event sourcing and the advantages of Wix's new "CRUD on steroids" approach, which streamlines API integration and domain event management while preserving data integrity and system resilience.
Participants will gain valuable insights into Wix's strategies for ensuring atomicity in database updates and event production, as well as caching, materialization, and performance optimization techniques within a distributed system.
Join us to discover how Wix has mastered the art of balancing simplicity and extensibility, and learn how the re-adoption of the modest CRUD has turbocharged their development velocity, resilience, and scalability in a high-growth environment.
Climate Science Flows: Enabling Petabyte-Scale Climate Analysis with the Eart... - Globus
The Earth System Grid Federation (ESGF) is a global network of data servers that archives and distributes the planet’s largest collection of Earth system model output for thousands of climate and environmental scientists worldwide. Many of these petabyte-scale data archives are located in proximity to large high-performance computing (HPC) or cloud computing resources, but the primary workflow for data users consists of transferring data, and applying computations on a different system. As a part of the ESGF 2.0 US project (funded by the United States Department of Energy Office of Science), we developed pre-defined data workflows, which can be run on-demand, capable of applying many data reduction and data analysis to the large ESGF data archives, transferring only the resultant analysis (ex. visualizations, smaller data files). In this talk, we will showcase a few of these workflows, highlighting how Globus Flows can be used for petabyte-scale climate analysis.
AI Pilot Review: The World’s First Virtual Assistant Marketing Suite - Google
https://sumonreview.com/ai-pilot-review/
AI Pilot Review: Key Features
✅Deploy AI expert bots in Any Niche With Just A Click
✅With one keyword, generate complete funnels, websites, landing pages, and more.
✅More than 85 AI features are included in the AI pilot.
✅No setup or configuration; use your voice (like Siri) to do whatever you want.
✅You Can Use AI Pilot To Create your version of AI Pilot And Charge People For It…
✅ZERO Manual Work With AI Pilot. Never write, Design, Or Code Again.
✅ZERO Limits On Features Or Usages
✅Use Our AI-powered Traffic To Get Hundreds Of Customers
✅No Complicated Setup: Get Up And Running In 2 Minutes
✅99.99% Up-Time Guaranteed
✅30 Days Money-Back Guarantee
✅ZERO Upfront Cost
Providing Globus Services to Users of JASMIN for Environmental Data Analysis - Globus
JASMIN is the UK’s high-performance data analysis platform for environmental science, operated by STFC on behalf of the UK Natural Environment Research Council (NERC). In addition to its role in hosting the CEDA Archive (NERC’s long-term repository for climate, atmospheric science & Earth observation data in the UK), JASMIN provides a collaborative platform to a community of around 2,000 scientists in the UK and beyond, providing nearly 400 environmental science projects with working space, compute resources and tools to facilitate their work. High-performance data transfer into and out of JASMIN has always been a key feature, with many scientists bringing model outputs from supercomputers elsewhere in the UK, to analyse against observational or other model data in the CEDA Archive. A growing number of JASMIN users are now realising the benefits of using the Globus service to provide reliable and efficient data movement and other tasks in this and other contexts. Further use cases involve long-distance (intercontinental) transfers to and from JASMIN, and collecting results from a mobile atmospheric radar system, pushing data to JASMIN via a lightweight Globus deployment. We provide details of how Globus fits into our current infrastructure, our experience of the recent migration to GCSv5.4, and of our interest in developing use of the wider ecosystem of Globus services for the benefit of our user community.
Custom Healthcare Software for Managing Chronic Conditions and Remote Patient... - Mind IT Systems
Healthcare providers often struggle with the complexities of chronic conditions and remote patient monitoring, as each patient requires personalized care and ongoing monitoring. Off-the-shelf solutions may not meet these diverse needs, leading to inefficiencies and gaps in care. This is where custom healthcare software offers a tailored solution, ensuring improved care and effectiveness.
Globus Connect Server Deep Dive - GlobusWorld 2024 - Globus
We explore the Globus Connect Server (GCS) architecture and experiment with advanced configuration options and use cases. This content is targeted at system administrators who are familiar with GCS and currently operate—or are planning to operate—broader deployments at their institution.
Understanding Globus Data Transfers with NetSage - Globus
NetSage is an open privacy-aware network measurement, analysis, and visualization service designed to help end-users visualize and reason about large data transfers. NetSage traditionally has used a combination of passive measurements, including SNMP and flow data, as well as active measurements, mainly perfSONAR, to provide longitudinal network performance data visualization. It has been deployed by dozens of networks world wide, and is supported domestically by the Engagement and Performance Operations Center (EPOC), NSF #2328479. We have recently expanded the NetSage data sources to include logs for Globus data transfers, following the same privacy-preserving approach as for Flow data. Using the logs for the Texas Advanced Computing Center (TACC) as an example, this talk will walk through several different example use cases that NetSage can answer, including: Who is using Globus to share data with my institution, and what kind of performance are they able to achieve? How many transfers has Globus supported for us? Which sites are we sharing the most data with, and how is that changing over time? How is my site using Globus to move data internally, and what kind of performance do we see for those transfers? What percentage of data transfers at my institution used Globus, and how did the overall data transfer performance compare to the Globus users?
How Recreation Management Software Can Streamline Your Operations - wottaspaceseo
Recreation management software streamlines operations by automating key tasks such as scheduling, registration, and payment processing, reducing manual workload and errors. It provides centralized management of facilities, classes, and events, ensuring efficient resource allocation and facility usage. The software offers user-friendly online portals for easy access to bookings and program information, enhancing customer experience. Real-time reporting and data analytics deliver insights into attendance and preferences, aiding in strategic decision-making. Additionally, effective communication tools keep participants and staff informed with timely updates. Overall, recreation management software enhances efficiency, improves service delivery, and boosts customer satisfaction.
Quarkus Hidden and Forbidden Extensions - Max Andersen
Quarkus has a vast extension ecosystem and is known for its subsonic and subatomic feature set. Some of these features are not as well known, and some extensions are less talked about, but that does not make them less interesting - quite the opposite.
Come join this talk to see some tips and tricks for using Quarkus and some of the lesser known features, extensions and development techniques.
Globus Compute with IRI Workflows - GlobusWorld 2024 - Globus
As part of the DOE Integrated Research Infrastructure (IRI) program, NERSC at Lawrence Berkeley National Lab and ALCF at Argonne National Lab are working closely with General Atomics on accelerating the computing requirements of the DIII-D experiment. As part of the work, the team is investigating ways to speed up the time to solution for many different parts of the DIII-D workflow, including how they run jobs on HPC systems. One of these routes is looking at Globus Compute as a way to replace the current method for managing tasks, and we describe a brief proof of concept showing how Globus Compute could help to schedule jobs and be a tool to connect compute at different facilities.
May Marketo Masterclass, London MUG May 22 2024 - Adele Miller
Can't make Adobe Summit in Vegas? No sweat because the EMEA Marketo Engage Champions are coming to London to share their Summit sessions, insights and more!
This is a MUG with a twist you don't want to miss.
TROUBLESHOOTING 9 TYPES OF OUTOFMEMORYERROR - Tier1 app
Even though at surface level 'java.lang.OutOfMemoryError' appears to be one single error, there are actually 9 types of OutOfMemoryError underneath. Each type of OutOfMemoryError has different causes, diagnosis approaches, and solutions. This session equips you with the knowledge, tools, and techniques needed to troubleshoot and conquer OutOfMemoryError in all its forms, ensuring smoother, more efficient Java applications.
An Enterprise Resource Planning system includes various modules that reduce any business's workload. Additionally, it organizes workflows, which drives enhanced productivity. Here is a detailed explanation of the ERP modules. Going through the points will help you understand how the software is changing work dynamics.
To know more details here: https://blogs.nyggs.com/nyggs/enterprise-resource-planning-erp-system-modules/
SOCRadar Research Team: Latest Activities of IntelBroker - SOCRadar
The European Union Agency for Law Enforcement Cooperation (Europol) has suffered an alleged data breach after a notorious threat actor claimed to have exfiltrated data from its systems. Infamous data leaker IntelBroker posted on the even more infamous BreachForums hacking forum, saying that Europol suffered a data breach this month.
The alleged breach affected Europol agencies CCSE, EC3, Europol Platform for Experts, Law Enforcement Forum, and SIRIUS. Infiltration of these entities can disrupt ongoing investigations and compromise sensitive intelligence shared among international law enforcement agencies.
However, this is neither the first nor the last activity of IntelBroker. We have compiled what has happened over the last few days. To track such hacker activity on dark web sources such as hacker forums, private Telegram channels, and other hidden platforms where cyber threats often originate, you can check SOCRadar's Dark Web News.
Stay Informed on Threat Actors’ Activity on the Dark Web with SOCRadar!
Experience our free, in-depth three-part Tendenci Platform Corporate Membership Management workshop series! In Session 1 on May 14th, 2024, we began with an Introduction and Setup, mastering the configuration of your Corporate Membership Module settings to establish membership types, applications, and more. Then, on May 16th, 2024, in Session 2, we focused on binding individual members to a Corporate Membership and Corporate Reps, teaching you how to add individual members and assign Corporate Representatives to manage dues, renewals, and associated members. Finally, on May 28th, 2024, in Session 3, we covered questions and concerns, addressing any queries or issues you may have.
For more Tendenci AMS events, check out www.tendenci.com/events
How to Position Your Globus Data Portal for Success: Ten Good Practices - Globus
Science gateways allow science and engineering communities to access shared data, software, computing services, and instruments. Science gateways have gained a lot of traction in the last twenty years, as evidenced by projects such as the Science Gateways Community Institute (SGCI) and the Center of Excellence on Science Gateways (SGX3) in the US, The Australian Research Data Commons (ARDC) and its platforms in Australia, and the projects around Virtual Research Environments in Europe. A few mature frameworks have evolved with their different strengths and foci and have been taken up by a larger community such as the Globus Data Portal, Hubzero, Tapis, and Galaxy. However, even when gateways are built on successful frameworks, they continue to face the challenges of ongoing maintenance costs and how to meet the ever-expanding needs of the community they serve with enhanced features. It is not uncommon that gateways with compelling use cases are nonetheless unable to get past the prototype phase and become a full production service, or if they do, they don't survive more than a couple of years. While there is no guaranteed pathway to success, it seems likely that for any gateway there is a need for a strong community and/or solid funding streams to create and sustain its success. With over twenty years of examples to draw from, this presentation goes into detail for ten factors common to successful and enduring gateways that effectively serve as best practices for any new or developing gateway.
The Big Calculator Gone Crazy
Author: Andrey Karpov
Date: 05.09.2013
In this article I'm going to discuss a problem few people think of. Computer simulation of various processes is becoming more and more widespread. This technology is wonderful because it allows us to save the time and materials that would otherwise be spent on needless chemical, biological, physical, and other kinds of experiments. A computer simulation model of the flow around a wing section may significantly reduce the number of prototypes to be tested in a real wind tunnel. Numerical experiments are given more and more trust nowadays. However, dazzled by the triumph of computer simulation, nobody notices the problem of growing software complexity behind it. People treat computers and computer programs just as a means to obtain the necessary results. I'm worried that very few know and care about the fact that software size growth leads to a non-linear growth in the number of software bugs. It's dangerous to exploit a computer while treating it as just a big calculator. So that's what I think, and I need to share this idea with other people.
The Big Calculator
At first I intended to entitle this article something like "If programmers can't create medicines, why can medics create programs?" Take some imaginary programmer: he is not allowed to develop and prepare medicines. The reason is obvious: he doesn't have the necessary education for that. With programming, however, it's not that simple. It may seem that an imaginary medic who has learned how to program will by default be a successful and useful programmer, especially given that a more or less acceptable programming skill is much easier to master than organic chemistry and the principles of medicine preparation. Here lies a trap. A computer experiment requires as much care as a real one. Lab workers are taught to wash test tubes after experiments and make sure they are sterile. But few really care about the problem of some array accidentally remaining uninitialized.
Programmers are well aware that the more complex software is, the more complicated and subtle the bugs that occur in it. In other words, I'm speaking of the non-linear growth in the number of bugs that accompanies code size growth. Programs performing chemical or any other scientific computations are far from simple, aren't they? Here's where the danger is. It's OK that a medic-programmer makes mistakes; any programmer, however skillful, makes them from time to time. What's not OK is that people tend to trust these results more and more. You calculate something and go on with your business.
Those engaged in programming as their professional activity know how dangerous this approach is. They know what undefined behavior is and how a program may only pretend to work well. There are numerous articles and books explaining how to correctly develop unit tests and ensure the correctness of computations.
Such is the world of programmers. But the world of chemists/physicists/medics is not that way, I'm afraid. They never write a complex program; they simply don't think that way. They use the computer as if it were just a Big Calculator. This comparison was suggested by one of our readers. Let me quote his comment in full here, so that English-speaking readers can learn about it too, once the article is translated.
I can tell you something on this subject from my own experience. Though I am a professional programmer, I actually come from a family of physicists and have a physics education. At the moment when I had to choose which university to enter, the call of blood was stronger than my faith in the bright future of IT. So I entered a physics university, rather prestigious on the local scale, which in fact is a "kindergarten" supervised by a large research institute in my native city, Nizhny Novgorod. People who know the subject will at once guess which research institute and which university I mean.
While studying there, I quite naturally proved to be one of the best at programming (and mathematical
methods of physical modeling in particular). It was at the same time I also figured out the following things:
1. Physicists tend to view the computer as a large multi-functional calculator allowing you to draw a graph
of Eta versus Theta with Gamma going to infinity. As one can naturally expect, they are mainly interested in
the graph itself, not the program.
2. As a consequence of the first fact, a programmer is not viewed as a profession. A programmer is just the
guy who knows how to use the Big Calculator to draw the needed graph. They don't care which way it will be
done - at all. Sorry, what did you say? Static analysis? Version control? Oh, come on, guys! C++ is the
language of programmers; physicists write in FORTRAN!
3. As a consequence of the previous fact, anyone who's going to devote his life to writing programs for physical modeling, even all-purpose ones, even tough-as-hell ones, is but an appendix to the Big Calculator. He's not even a person - just a kind of... By the way, it was not only me treated that way by the physicists (I was just an ordinary student, after all), but even the best computer modeling specialist in the research institute, who taught a computational methods course at our university and who, when I turned to him as my thesis adviser while writing my term paper, told me almost straight out, "They will despise you, so be prepared to tolerate that".
I didn't want to tolerate that, and after graduation I left the computer modeling area for a field where programmers are not thought to be untermenschen. I hope this example will help you understand why initiatives like introducing static analysis even into relatively large (about 20 or 30 developers) computer modeling projects are a hopeless job. There simply might not be a person who knows what it is. And if such a person does happen to be on the team, they'll most likely trample him, because they don't need no trendy programmer frills of yours. "We've been doing without them for a hundred years - and will do for more."
Here's another story for those not bored yet. My father, though a pensioner, still works at a very large defense engineering enterprise here in Nizhny Novgorod (it's the largest in our city and one of the largest across the country; again, those who know the subject will guess it ;) ). He's been programming in FORTRAN his entire life. He started at the time when punched cards were in use. I don't blame him for not studying C++. It was already too late for him 10 years ago, and he still gets on pretty well. However, there are certain security precautions at this enterprise, 2/3 of whose staff are engaged in programming one way or another:
1. No Internet. At all. If you need literature, you go to the library. Stack Overflow? What's that? If you need to send an e-mail, you have to submit a written request to the boss explaining to whom you want to send it and why. Only a few chosen ones can use the Internet "against a receipt". Thank God they have an internal network at least.
2. No administration rights on your computer. Perhaps this restriction makes sense for the white-collar mass,
but I can't imagine a programmer feeling satisfied with it.
3. (This doesn't relate to the subject; it's just an illustration.) You can't even bring in a cell phone with an integrated camera (have you seen one without a camera nowadays?).
As a result, even young employees write code in FORTRAN, while really skilled programmers are very few. I
know that for sure because I trained some guy of 25 whom my father had recommended as a promising
programmer.
Here's my verdict: they are stuck in the '80s there. Even given that they have pretty good salaries, I wouldn't go there for the world.
These are just two examples from the intellectual elite's life. I don't mean to discredit anyone; they do their job well enough, but my heart bleeds as I watch which windmills my father has to fight sometimes. (Thank God I've managed to persuade him to start using git recently.) No OOP in a million-line project, no static analysis - nothing.
It just has to do with the human trait of being very conservative about areas which are not one's strong points.
Ilja Mayzus. The original comment.
The core of this story is the ideology of treating the computer as a Big Calculator. In that case, you don't need to know any more about it than its younger brother, the pocket calculator, deserves. And that's the way it is actually used in many areas. Let's digress for a while and have a look inside the world of physics, to see how another theory found its confirmation. To do this, I'll again have to quote a large extract, this time from Brian Greene's book "The Elegant Universe: Superstrings, Hidden Dimensions, and the Quest for the Ultimate Theory" [1]:
We all huddled around Morrison's computer in the office he and I shared. Aspinwall told Morrison how to
bring his program up on the screen and showed us the precise form for the required input. Morrison
appropriately formatted the results we had generated the previous night, and we were set to go.
The particular calculation we were performing amounts, roughly speaking, to determining the mass of a
certain particle species—a specific vibrational pattern of a string—when moving through a universe whose
Calabi-Yau component we had spent all fall identifying. We hoped, in line with the strategy discussed earlier,
that this mass would agree identically with a similar calculation done on the Calabi-Yau shape emerging
from the space-tearing flop transition. The latter was the relatively easy calculation, and we had completed
it weeks before; the answer turned out to be 3, in the particular units we were using. Since we were now
doing the purported mirror calculation numerically on a computer, we expected to get something extremely
close to but not exactly 3, something like 3.000001 or 2.999999, with the tiny difference arising from
rounding errors.
Morrison sat at the computer with his finger hovering over the enter button. With the tension mounting he
said, "Here goes," and set the calculation in motion. In a couple of seconds the computer returned its
answer: 8.999999. My heart sank. Could it be that space-tearing flop transitions shatter the mirror relation,
likely indicating that they cannot actually occur? Almost immediately, though, we all realized that something
funny must be going on. If there was a real mismatch in the physics following from the two shapes, it was
extremely unlikely that the computer calculation should yield an answer so close to a whole number. If our
ideas were wrong, there was no reason in the world to expect anything but a random collection of digits. We
had gotten a wrong answer, but one that suggested, perhaps, that we had just made some simple arithmetic
error. Aspinwall and I went to the blackboard, and in a moment we found our mistake: we had dropped a
factor of 3 in the "simpler" calculation we had done weeks before; the true result was 9. The computer
answer was therefore just what we wanted.
Of course, the after-the-fact agreement was only marginally convincing. When you know the answer you
want, it is often all too easy to figure out a way of getting it. We needed to do another example. Having
already written all of the necessary computer code, this was not hard to do. We calculated another particle
mass on the upper Calabi-Yau shape, being careful this time to make no errors. We found the answer: 12.
Once again, we huddled around the computer and set it on its way. Seconds later it returned 11.999999.
Agreement. We had shown that the supposed mirror is the mirror, and hence space-tearing flop transitions
are part of the physics of string theory.
At this I jumped out of my chair and ran an unrestrained victory lap around the office. Morrison beamed
from behind the computer. Aspinwall's reaction, though, was rather different. "That's great, but I knew it
would work," he calmly said. "And where's my beer?"
I truly believe they are geniuses. But let's imagine for a moment that it was some ordinary students who used this approach to calculate an integral. I don't think programmers would take it seriously then. And what if the program had generated 3 right away? Would the bug have been taken as the final evidence? I think it would have cleared up later, during a re-check by themselves or their scientist colleagues. Still, the "ideal spherical programmer in a vacuum" is scared to death by this fact.
This is how things are in reality. It's not only personal computers that are used in such a way; cluster systems are exploited for scientific computations too. And what's most scary, people trust the results produced by programs. In the future we are going to deal with even more computations of this kind, and the price of software bugs will become steeper too.
Isn't it time to change something?
Yes, nobody can forbid me to stick a plaster on a cut myself; I guess I can recommend some medicine to take when you've caught a cold. But no more than that. I cannot drill a tooth or write out a prescription. Don't you find it reasonable that developers creating a software system whose responsibility extends beyond a certain scope should also have to confirm their skill?
I know there exist various certifications. But I'm talking of a different thing now. Certification is intended to
ensure that program code conforms to certain standards. It partially prevents slopwork, in an indirect way.
But the range of areas where certification is a strict requirement is quite narrow. It obviously does not cover
the whole set of areas and situations where careless use of the Big Calculator may do much harm.
Example of the Danger
I guess many of you find my worries too abstract. That's why I suggest examining a few real-life examples.
There is an open-source package, Trans-Proteomic Pipeline (TPP), designed to solve various tasks in biology. No doubt it is used - by its developers and perhaps some third-party organizations. I believe that any bug in it is already a potential issue. And does it have bugs? Yes, it does; and more keep appearing. We checked this project one year ago and reported on it in the blog post "Analysis of the Trans-Proteomic Pipeline (TPP) project".
What has changed since then? Nothing. The project keeps developing and accumulating new bugs. The Big Calculator ideology has won. The developers are not writing a high-quality project with the minimum possible number of bugs. They simply solve their tasks; otherwise they would have reacted in some way to last year's article and considered introducing static analysis tools. I don't mean they must necessarily choose PVS-Studio; there are numerous other static code analyzers. The point is that their respectable application keeps accumulating the most trivial bugs. Let's see what fresh ones they've got.
1. Some bungler goes on writing incorrect loops
In the previous article I mentioned incorrect loop conditions. The new package version has them too.
double SpectraSTPeakList::calcDot(SpectraSTPeakList* other) {
....
for (i = this->m_bins->begin(), j = other->m_bins->begin();
i != this->m_bins->end(), j != other->m_bins->end();
i++, j++) {
d = (*i) * (*j);
dot += d;
}
....
}
PVS-Studio's diagnostic message: V521 Such expressions using the ',' operator are dangerous. Make sure the
expression is correct. spectrastpeaklist.cpp 504
In the check "i != this->m_bins->end(), j != other->m_bins->end()", the expression before the comma does
not check anything. The ',' operator is used to execute expressions both to the right and to the left of it in
the left-to-right order and return the value of the right expression. This is what the correct check should
look like:
i != this->m_bins->end() && j != other->m_bins->end()
The same defect can be also found in the following fragments:
• spectrastpeaklist.cpp 516
• spectrastpeaklist.cpp 529
• spectrastpeaklist.cpp 592
• spectrastpeaklist.cpp 608
• spectrastpeaklist.cpp 625
• spectrastpeaklist.cpp 696
2. Null pointer dereferencing
This bug won't lead to outputting incorrect computation results - it will cause a crash instead, which is much
better. However, it would be strange not to mention these bugs.
void ASAPRatio_getDataStrctRatio(dataStrct *data, ....)
{
....
int *outliers, *pepIndx=NULL;
....
//pepIndx doesn't change
....
if(data->dataCnts[i] == 1 && pepIndx[i] == 0)
data->dataCnts[i] = 0;
....
}
PVS-Studio's diagnostic message: V522 Dereferencing of the null pointer 'pepIndx' might take place.
asapcgidisplay2main.cxx 534
7. The same defect can be also found in the following fragments:
• Pointer 'peptides'. asapcgidisplay2main.cxx 556
• Pointer 'peptides'. asapcgidisplay2main.cxx 557
• Pointer 'peptides'. asapcgidisplay2main.cxx 558
• Pointer 'peptides'. asapcgidisplay2main.cxx 559
• Pointer 'peptides'. asapcgidisplay2main.cxx 560
• Pointer 'pepIndx'. asapcgidisplay2main.cxx 569
3. Uncleared arrays
static void clearTagNames() {
std::vector<const char *>ptrs;
for (tagname_set::iterator i = tagnames.begin();
i!=tagnames.end();i++) {
ptrs.push_back(*i);
}
for (tagname_set::iterator j = attrnames.begin();
j!=attrnames.end();j++) {
ptrs.push_back(*j);
}
tagnames.empty();
attrnames.empty();
for (size_t n=ptrs.size();n--;) {
delete [] (char *)(ptrs[n]); // cast away const
}
}
In this code the analyzer has caught two uncleared arrays at once:
V530 The return value of function 'empty' is required to be utilized. tag.cxx 72
V530 The return value of function 'empty' is required to be utilized. tag.cxx 73
You should call the clear() function instead of empty().
4. Uninitialized class objects
class ExperimentCycleRecord {
public:
ExperimentCycleRecord() {
ExperimentCycleRecord(0,0,0,True,False);
}
ExperimentCycleRecord(long lExperiment, long lCycleStart,
long lCycleEnd, Boolean bSingleCycle,
Boolean bRangleCycle)
{
....
}
....
}
PVS-Studio's diagnostic message: V603 The object was created but it is not being used. If you wish to call
constructor, 'this->ExperimentCycleRecord::ExperimentCycleRecord(....)' should be used.
mascotconverter.cxx 101
The ExperimentCycleRecord() constructor does not do what it is intended to; it doesn't initialize anything.
The developer might be an excellent chemist, but if he doesn't know how to use the C++ language properly,
his computations using uninitialized memory aren't worth a damn. It's like using a dirty test tube.
Instead of calling another constructor, the line "ExperimentCycleRecord(0,0,0,True,False);" creates a
temporary object which will be destroyed after that. This error pattern is discussed in detail in the article
"Wade not in unknown waters. Part one".
The same defect can be also found in the following fragments:
• asapratiopeptideparser.cxx 57
• asapratiopeptidecgidisplayparser.cxx 36
• cruxdiscrimfunction.cxx 36
• discrimvalmixturedistr.cxx 34
• mascotdiscrimfunction.cxx 47
• mascotscoreparser.cxx 37
• tandemdiscrimfunction.cxx 35
• tandemkscoredf.cxx 37
• tandemnativedf.cxx 37
5. Comments breaching execution logic
int main(int argc, char** argv) {
....
if (getIsInteractiveMode())
//p->writePepSHTML();
//p->printResult();
// regression test?
if (testType!=NO_TEST) {
TagListComparator("InterProphetParser",testType,
outfilename,testFileName);
....
}
PVS-Studio's diagnostic message: V628 It's possible that the line was commented out improperly, thus
altering the program's operation logics. interprophetmain.cxx 175
After the 'if' operator, a few lines executing some operations were commented out. As a result, the program logic changed in a way quite different from what was expected. The programmer didn't want any actions to be performed when the condition held. Instead, the 'if' operator now affects the code below it. As a consequence, the tests' output depends not only on the "testType!=NO_TEST" condition but on the "getIsInteractiveMode()" condition as well. That is, the test may not test anything. That's why I strongly recommend not relying fully on one testing methodology alone (for example, TDD).
6. Misprints
Misprints are to be found everywhere and all the time. It's not that bad if you get fewer hit points after an
explosion in a game than you ought to, because of such a bug. But what do incorrect data mean when
computing chemical reactions?
void ASAPRatio_getProDataStrct(proDataStrct *data, char **pepBofFiles)
{
....
if (data->indx == -1) {
data->ratio[0] = -2.;
data->ratio[0] = 0.;
data->inv_ratio[0] = -2.;
data->inv_ratio[1] = 0.;
return;
}
....
}
PVS-Studio's diagnostic message: V519 The 'data->ratio[0]' variable is assigned values twice successively.
Perhaps this is a mistake. Check lines: 130, 131. asapcgidisplay2main.cxx 131
One and the same variable is by mistake assigned two different values. The correct code is this one:
data->ratio[0] = -2.;
data->ratio[1] = 0.;
This fragment was then copied-and-pasted into other parts of the program:
• asapcgidisplay2main.cxx 338
• asapcgidisplay2main.cxx 465
• asapratioproteincgidisplayparser.cxx 393
• asapratioproteincgidisplayparser.cxx 518
7. Comparing signed and unsigned values
Comparing signed and unsigned values properly requires some skill. Ordinary calculators don't deal with
unsigned values, but the C++ language does.
size_type size() const;
void computeDegenWts()
{
....
int have_cluster = 0;
....
if ( have_cluster > 0 && ppw_ref.size() - have_cluster > 0 )
....
}
PVS-Studio's diagnostic message: V555 The expression 'ppw_ref.size() - have_cluster > 0' will work as
'ppw_ref.size() != have_cluster'. proteinprophet.cpp 6767
The programmer wanted the check "ppw_ref.size() > have_cluster" to be executed. But he got something very different instead.
To make it clearer, let's assume that the 'size_t' type is 32-bit. Suppose the function "ppw_ref.size()" returns 10 while the variable have_cluster equals 15. The function ppw_ref.size() returns the unsigned type 'size_t'. According to the C++ rules, the right operand of the subtraction must also be converted to 'size_t' before the subtraction is executed. So far so good: we have 10u on the left and 15u on the right.
Here goes the subtraction:
10u - 15u
And here's where we get into trouble. Those very C++ rules tell us that the result of subtracting two unsigned variables must also be unsigned.
It means that 10u - 15u = 0xFFFFFFFBu. As you know, 4294967291 is greater than 0.
The Big Calculator Riot is successful. Writing a correct theoretical algorithm is only half the job; you also need to write correct code.
A similar bug can be found in the following fragment:
double SpectraSTPeakList::calcXCorr() {
....
for (int tau = -75; tau <= 75; tau++) {
float dot = 0.0;
for (unsigned int b = 0; b < numBins; b++) {
if (b + tau >= 0 && b + tau < (int)numBins) {
dot += (*m_bins)[b] * theoBins[b + tau] / 10000.0;
}
}
....
....
}
PVS-Studio's diagnostic message: V547 Expression 'b + tau >= 0' is always true. Unsigned type value is
always >= 0. spectrastpeaklist.cpp 2058
As you can see, the variable 'tau' takes values within the range [-75, 75]. To avoid an array overrun, the check b + tau >= 0 is used. I guess you've already got the point that this check won't work. The variable 'b' has the 'unsigned' modifier, so the "b + tau" expression's result is unsigned too. And an unsigned value is always greater than or equal to 0.
8. Strange loop
const char* ResidueMass::getStdModResidues(....) {
....
for (rmap::const_iterator i = p.first; i != p.second; ++i) {
const cResidue &r = (*i).second;
if (r.m_masses[0].m_nterm) {
n_term_aa_mod = true;
} else if (r.m_masses[0].m_cterm) {
c_term_aa_mod = true;
}
return r.m_residue.c_str();
}
if(! strcmp(mod, "+N-formyl-met (Protein)")) {
return "n";
} if (! strcmp(mod, "13C6-15N2 (K)")) {
return "K";
} if (! strcmp(mod, "13C6-15N4 (R)")) {
return "R";
....
}
PVS-Studio's diagnostic message: V612 An unconditional 'return' within a loop. residuemass.cxx 1442
There is a 'return' operator inside the loop, and it is executed unconditionally. The loop can execute only once, after which the function terminates. Either it's a misprint, or some condition is missing before the 'return' operator.
9. Rough calculations
double RTCalculator::getUsedForGradientRate() {
if (rts_.size() > 0)
return used_count_ / rts_.size();
return 0.;
}
PVS-Studio's diagnostic message: V636 The 'used_count_ / rts_.size()' expression was implicitly casted from
'int' type to 'double' type. Consider utilizing an explicit type cast to avoid the loss of a fractional part. An
example: double A = (double)(X) / Y;. rtcalculator.cxx 6406
Since the function returns values of the double type, I find it reasonable to suppose the following: when the variable 'used_count_' holds the value 5 and the function rts_.size() returns 7, the approximate result should be 0.714. However, the function getUsedForGradientRate() will return 0 in this case.
The variable 'used_count_' has the 'int' type, and the rts_.size() function also returns an integer. An integer division occurs, and the result is obviously zero. That zero is then implicitly cast to double, but by that point it no longer matters.
To fix the defect, the code should be rewritten in the following way:
return static_cast<double>(used_count_) / rts_.size();
Other defects of this kind:
• cgi_pep3d_xml.cxx 3203
• cgi_pep3d_xml.cxx 3204
• asapratiopeptideparser.cxx 4108
10. Great and mighty Copy-Paste
The function setPepMaxProb() contains a few large similarly looking blocks. In this fragment one can feel
that specific smell of the Copy-Paste technique. Using it naturally results in an error. I had to SIGNIFICANTLY
abridge the sample text. The bug is very noticeable in the abridged code, but it's almost impossible to see it
in the original code. Yeah, it's an advertisement of static analysis tools in general and PVS-Studio in
particular.
void setPepMaxProb( bool use_nsp, bool use_fpkm,
bool use_joint_probs, bool compute_spectrum_cnts )
{
double prob = 0.0;
double max2 = 0.0;
double max3 = 0.0;
double max4 = 0.0;
double max5 = 0.0;
double max6 = 0.0;
double max7 = 0.0;
....
if ( pep3 ) { ... if ( use_joint_probs && prob > max3 ) ... }
....
if ( pep4 ) { ... if ( use_joint_probs && prob > max4 ) ... }
....
if ( pep5 ) { ... if ( use_joint_probs && prob > max5 ) ... }
....
if ( pep6 ) { ... if ( use_joint_probs && prob > max6 ) ... }
....
if ( pep7 ) { ... if ( use_joint_probs && prob > max6 ) ... }
....
}
PVS-Studio's diagnostic message: V525 The code containing the collection of similar blocks. Check items
'max3', 'max4', 'max5', 'max6', 'max6' in lines 4664, 4690, 4716, 4743, 4770. proteinprophet.cpp 4664
Unfortunately, the V525 diagnostic produces many false positives and is therefore relegated to the third-level warnings. But if one overcomes one's laziness and studies this class of warnings, one may find a number of such nice bugs.
11. Pointer is not initialized sometimes
int main(int argc, char** argv) {
....
ramp_fileoffset_t *pScanIndex;
....
if ( (pFI=rampOpenFile(mzXmlPath_.c_str()))==NULL) {
....
} else {
....
pScanIndex = readIndex(pFI, indexOffset, &iAnalysisLastScan);
....
}
....
if (pScanIndex != NULL)
free(pScanIndex);
return 0;
}
PVS-Studio's diagnostic message: V614 Potentially uninitialized pointer 'pScanIndex' used. sqt2xml.cxx 476
This program may crash at the end if the function rampOpenFile() returns NULL. It's not critical, yet unpleasant.
Here's another variable that may remain uninitialized:
• Potentially uninitialized pointer 'fp_' used. dta-xml.cpp 307
12. Virtual destructor missing
class DiscriminantFunction {
public:
DiscriminantFunction(int charge);
virtual Boolean isComputable(SearchResult* result) = 0;
virtual double getDiscriminantScore(SearchResult* result) = 0;
virtual void error(int charge);
protected:
int charge_;
double const_;
}; // class
class CometDiscrimFunction : public DiscriminantFunction;
class CruxDiscrimFunction : public DiscriminantFunction;
class InspectDiscrimFunction : public DiscriminantFunction;
.....
class DiscrimValMixtureDistr : public MixtureDistr {
....
DiscriminantFunction* discrim_func_;
....
};
DiscrimValMixtureDistr::~DiscrimValMixtureDistr() {
delete[] posinit_;
delete[] neginit_;
delete discrim_func_;
}
PVS-Studio's diagnostic message: V599 The virtual destructor is not present, although the
'DiscriminantFunction' class contains virtual functions. discrimvalmixturedistr.cxx 206
A number of classes, such as CometDiscrimFunction and CruxDiscrimFunction, are derived from the
DiscriminantFunction class. The DiscrimValMixtureDistr destructor destroys the discrim_func_ object
through a pointer to the base class DiscriminantFunction. Since the DiscriminantFunction destructor is
not declared virtual, the derived class's destructor will never be called, with all the ensuing consequences.
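A minimal sketch of why this matters (Base/Derived are simplified stand-ins for DiscriminantFunction and its descendants; here the destructor is already declared virtual, which is the fix):

```cpp
static int derived_dtor_calls = 0;

struct Base {
    virtual void f() {}
    virtual ~Base() {}   // the fix: without 'virtual', deleting a Derived
};                       // through a Base* skips ~Derived() (undefined behavior)

struct Derived : Base {
    ~Derived() { ++derived_dtor_calls; }  // frees resources in real code
    void f() override {}
};

void destroy_through_base()
{
    Base* p = new Derived();
    delete p;            // with a virtual dtor, ~Derived() runs as expected
}
```

Remove the `virtual` from ~Base() and the derived destructor is silently skipped, leaking whatever it was supposed to release.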
13. Miscellaneous
There are a number of small defects which won't have serious consequences but are still unpleasant to
have in your code. There are also strange fragments, though I can't say for sure whether they are incorrect.
Here's one of them:
Boolean MixtureModel::iterate(int counter) {
....
if (done_[charge] < 0) {
done_[charge];
}
else if (priors_[charge] > 0.0) {
done_[charge] += extraitrs_;
}
....
}
PVS-Studio's diagnostic message: V607 Ownerless expression 'done_[charge]'. mixturemodel.cxx 1558
What is it? Incomplete code? Or maybe the programmer just wanted to point out that nothing should be
done when the "done_[charge] < 0" condition is true?
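Either way, a statement consisting of a bare array access compiles to a no-op. If the empty branch really is deliberate, it is clearer to say so explicitly. A hedged sketch of what the intended code might look like (done_, priors_, and extraitrs_ mirror the names from mixturemodel.cxx; the surrounding logic is my assumption):

```cpp
#include <vector>

// Sketch of the suspicious fragment with the intent made explicit.
void iterate_step(std::vector<int>& done_, std::vector<double>& priors_,
                  int charge, int extraitrs_)
{
    if (done_[charge] < 0) {
        // intentionally nothing to do for negative entries
    } else if (priors_[charge] > 0.0) {
        done_[charge] += extraitrs_;
    }
}
```

An empty block with a comment (or simply dropping the branch) expresses the same thing without triggering V607.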
And here is an incorrect way of freeing memory. Any critical consequences are unlikely, but the code
still smells.
string Field::getText(....)
{
....
char* pepString = new char[peplen + 1];
....
delete pepString;
....
}
PVS-Studio's diagnostic message: V611 The memory was allocated using 'new T[]' operator but was released
using the 'delete' operator. Consider inspecting this code. It's probably better to use 'delete [] pepString;'.
pepxfield.cxx 1023
The correct way of doing this is to write "delete [] pepString". There are many other defects of this kind:
• cruxdiscrimvalmixturedistr.cxx 705
• cruxdiscrimvalmixturedistr.cxx 715
• mascotdiscrimvalmixturedistr.cxx 426
• mascotdiscrimvalmixturedistr.cxx 550
• mascotdiscrimvalmixturedistr.cxx 624
• phenyxdiscrimvalmixturedistr.cxx 692
• probiddiscrimvalmixturedistr.cxx 487
• probiddiscrimvalmixturedistr.cxx 659
• tandemdiscrimvalmixturedistr.cxx 731
• tandemdiscrimvalmixturedistr.cxx 741
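The rule the analyzer enforces is simple: memory allocated with new T[] must be released with delete[], never with plain delete. A minimal sketch of the corrected pattern (copy_and_release() is an illustrative helper; peplen and pepString mirror the names from pepxfield.cxx):

```cpp
#include <cstring>

// Copies a peptide string into a freshly allocated buffer,
// measures it, and releases the buffer correctly.
std::size_t copy_and_release(const char* pep)
{
    std::size_t peplen = std::strlen(pep);
    char* pepString = new char[peplen + 1];  // array new...
    std::memcpy(pepString, pep, peplen + 1);
    std::size_t n = std::strlen(pepString);
    delete[] pepString;                      // ...requires array delete
    return n;
}
```

Mixing the forms is undefined behavior; with trivially destructible types like char it usually "works", which is exactly why such bugs survive for years.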
And here's an incorrect implementation of the "--" operator. It doesn't seem to be used anywhere;
otherwise, the bug would quickly have revealed itself.
CharIndexedVectorIterator operator++(int)
{ // postincrement
CharIndexedVectorIterator _Tmp = *this;
++m_itr;
return (_Tmp);
}
CharIndexedVectorIterator& operator--()
{ // predecrement
++m_itr;
return (*this);
}
PVS-Studio's diagnostic message: V524 It is odd that the body of '--' function is fully equivalent to the body
of '++' function. charindexedvector.hpp 81
The "--" and "++" operators are implemented identically; the code must have been copy-pasted. The
same defect occurs here:
• charindexedvector.hpp 87
• charindexedvector.hpp 159
• charindexedvector.hpp 165
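The fix is simply to make the predecrement operator actually decrement the wrapped iterator. A simplified sketch (Iter is a stand-in for CharIndexedVectorIterator, wrapping an int index instead of a real vector iterator):

```cpp
// Simplified stand-in for CharIndexedVectorIterator.
struct Iter {
    int m_itr;

    Iter& operator++() { ++m_itr; return *this; }  // preincrement

    Iter& operator--() { --m_itr; return *this; }  // predecrement:
                                                   // must use --, not ++
    Iter operator--(int)                           // postdecrement
    {
        Iter tmp = *this;
        --m_itr;
        return tmp;                                // returns the old value
    }
};
```

When copy-pasting an operator pair like this, the one character that differs is exactly the one that gets forgotten, which is what the V524 diagnostic is designed to catch.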
Let's stop here; it's all not very interesting, and the article is long enough. As usual, I urge the developers
not to limit themselves to fixing only the defects mentioned here. Download and check the project with
PVS-Studio yourself; I may have missed many errors. We can even grant you a free registration key for
some time.
Summary
Unfortunately, the article has turned out a bit tangled. What did the author want to say, after all? Let me
restate, very briefly, the ideas I want to share with you.
1. We are currently using more and more programs to perform scientific and engineering
computations and simulate various processes, and we grow to trust them.
2. Programs get very complicated. Professional programmers understand very well that one cannot
approach the task of creating a software package for computer simulation the same way as using
a software calculator. The growth of software complexity leads to an exponential increase in the
number of errors [2].
3. It follows that physicists, biologists, and medics can no longer simply "calculate something" in the
usual manner. They cannot ignore the growth of software complexity, or the incorrect computations
that result from imperfect knowledge of a programming language.
4. In this article I've given arguments to prove that this is the real state of things. The first quotation
tells us that people tend to treat the computer as an ordinary calculator. The second quotation just
reaffirms this idea. The error samples discussed after that are meant to demonstrate that people
really make mistakes when treating computer simulation software in such a way. So, my anxiety has
solid ground.
So, what shall we do?
First of all, I'd like you to recognize this problem and tell your colleagues in related fields about it. It has
long been clear to programmers that growing software complexity and silly mistakes in large projects can
easily become a source of great harm. People who treat programming and computers as just a tool, on the
other hand, don't know that and don't stop to think about it. So we need to draw their attention to this
problem.
Here's an analogy. Imagine a man who gets himself a cudgel and starts hunting animals. The cudgel in his
hands gradually turns into a stone axe, then a sword, and finally a gun. But he still uses it only to stun
hares by hitting them on the head. This way of using the weapon is not just absolutely inefficient; it has
also become much more dangerous (he can accidentally shoot himself or his fellows). Hunters from the
"programmers" tribe adapt to these changes quickly. The rest have no time for that; they are busy hunting
hares. After all, it's all about the hares. We need to tell these people that they have to learn, whether they
like it or not. It will improve everyone's life. And waving a gun around does no one any good.
References
1. Brian Greene, "The Elegant Universe: Superstrings, Hidden Dimensions, and the Quest for the
Ultimate Theory". ISBN 978-0375708114
2. Andrey Karpov. Feelings confirmed by numbers. http://www.viva64.com/en/b/0158/