Security and FUD
-
Surviving the Frequency of Open Source Vulnerabilities
One hurdle in any roll-your-own Linux platform development project is getting the necessary tools to build system software, application software, and the Linux kernel for your target embedded device. Many developers use a set of tools based on the GNU Compiler Collection, which requires two other software packages: a C library used by the compiler; and a set of tools required to create executable programs and associated libraries for your target device. The end result is a toolchain.
[...]
Rather than working on features or product differentiation, developers often spend valuable time supporting, maintaining, and updating a cross-compilation environment, Linux kernel, and root file system. All of this requires a significant investment in personnel and a wide range of expertise.
-
Netgate® Extends Free pfSense® Support and Lowers pfSense Support Subscription Pricing to Aid in COVID-19 Relief
Free zero-to-ping support, free VPN configuration and connection support, free direct assistance for first-responder and front-line healthcare agencies, and reduced pfSense TAC support subscription prices all introduced
-
How the hackers are using Open Source Libraries to their advantage [Ed: Conflating hackers with crackers]
Ben Porter, Chief Product Officer at Instaclustr, writes about how the potential of Open Source Libraries must be balanced with the growing risk of library jacking by hackers.
-
Three Cases Where the Open Source Model Didn't Work [Ed: Lots of anti-GPL FUD and not taking any account of Microsoft crimes, monopoly abuse, bribes and blackmail]
So, why didn’t the open source model work in these three cases?
The main reason is that in all of these cases, data structure specs and the description of algorithms are not the most important piece of the picture.
The root of the problem lies in the variety of real-life situations where bugs and failures may occur and lead to data loss, which is a total no-go in the real world.
However successful the open source community has been in creating open source programs and platforms, that success is still no guarantee of industrial-grade software development(3). The key to developing a highly reliable solution is a carefully nurtured auto-test environment. This ensures a careful track record and in-depth analysis of every failure, as well as an effective workflow that makes sure any given bug or failure never repeats. It's obvious that building such an environment can take years, if not decades, and the main thing here is not knowing how something should work according to specs, but knowing how and where exactly it fails. In other words, the main problem is not the resources needed to develop the code; it is the time needed to build up reliable test coverage that provides a sufficient barrier against data-loss bugs.
Another problem with open source is that it is usually accompanied by a GPL license. This limits contributions to such projects almost solely to the open source community itself. One of the major requirements of the GPL is to disclose changes to the source code upon further distribution, which makes it pointless for commercial players to participate.
More in Tux Machines
digiKam 7.7.0 is released
After three months of active maintenance and another bug triage, the digiKam team is proud to present version 7.7.0 of its open source digital photo manager. See below the list of most important features coming with this release.
Dilution and Misuse of the "Linux" Brand
Samsung, Red Hat to Work on Linux Drivers for Future Tech
The metaverse is expected to uproot system design as we know it, and Samsung is one of many hardware vendors re-imagining data center infrastructure in preparation for a parallel 3D world. Samsung is working on new memory technologies that provide faster bandwidth inside hardware for data to travel between CPUs, storage and other computing resources. The company also announced it was partnering with Red Hat to ensure these technologies have Linux compatibility.
today's howtos