The U.S. White House Office of Management and Budget (OMB) is considering a new policy for sharing source code for software created by or for government projects. There’s a lot to love about the proposed policy: it would make it easier for people to find and reuse government software, and explicitly encourages government agencies to prioritize free and open source software options in their procurement decisions.
EFF submitted a comment on the policy through the White House’s GitHub repository (you can also download our comment as a PDF). The OMB is encouraging people to send comments through GitHub, reply to and +1 each other’s comments, and even offer direct edits to the policy via pull requests.
The biggest news about Docker 1.11 isn't features in the application, but that it uses component versions standardized under the aegis of the Open Container Initiative.
The Open Container Initiative (OCI) has taken the next step in establishing a standards base for the emerging container ecosystem. The organization has launched a project to establish a container image format specification.
Docker has issued a report based on a survey of more than 500 people currently using and deploying container technology at various stages. The full report, “Evolution of the Modern Software Supply Chain,” is available here: https://www.docker.com/survey-2016
The report sheds light on container technology in general as well as how Docker fits into the ecosystem. Among other findings, the report noted that Docker is central to many hybrid cloud/multi-cloud strategies. In fact, 80 percent of respondents using Docker describe it as part of their cloud strategy for a variety of reasons, including migration, hybrid cloud portability, and avoiding lock-in to a single cloud vendor.
Another complementary approach to standards development is the release of designs and specifications into the open source community as open hardware and interface standards for others to adopt. Examples include Arduino, Raspberry Pi, and BeagleBone, which enable quick prototyping, as well as the mangOH open hardware reference design, an open source design that is more easily scalable in commercial settings and is built specifically for IoT cellular connectivity.
Open source platforms like these enable developers who may have limited hardware, wireless, or low-level software expertise to start developing IoT applications in days rather than months. If executed properly, these platforms can significantly reduce the time and effort to get prototypes from paper to production by ensuring that various connectors and sensors work together automatically, with no additional coding required. With industrial-grade specifications, these next-generation platforms allow not only quick prototyping but also rapid industrialization of IoT applications.
When big data mavens debate the merits of using Apache Spark versus Apache Storm for streaming data processing, the argument usually sounds like this: Sure, Storm has great scale and speed, but it's hard to use. Plus, it's slowly being overtaken by Spark, so why go with old and busted when there's new and hot?
That's why Apache Storm 1.0 hopes to turn the ship around, not only by making Storm faster but also by making it easier and more convenient to work with.
The Dutch government should set up a resource centre on free software and open standards, says Member of Parliament Astrid Oosenbrug. “There is a serious lack of understanding of these two topics in the government”, the MP says. The centre should remedy this, and Ms Oosenbrug has started studying possibilities and options.
Cumulus Networks, Dell and Red Hat have forged a partnership to bring DevOps efficiencies to the open source cloud by automating networking and deployment for OpenStack clusters, according to news announced this morning.
Cumulus Networks, the leading provider of Linux networking operating systems, today announced a collaboration with Dell, the leading provider of open and innovative technologies, and Red Hat, Inc., the world's leading provider of open source solutions, to simplify large-scale OpenStack deployments without the need for any proprietary software-defined networking (SDN) fabric solutions. The resulting solution offers an all-Linux OpenStack pod that is easy to install and maintain, and incorporates the latest networking technologies.
Results from the seventh OpenStack Foundation user survey are out, and they paint a picture of a powerful cloud platform that has squarely moved from the evaluation stage at many enterprises to deployment stage. Sixty-five percent of OpenStack deployments are now in production, 33 percent more than a year ago, according to the findings. And 97 percent of community members said that “standardizing on the same open platform and APIs that power a global network of public and private clouds” was one of their top five considerations in choosing OpenStack.
Sharone Zitzman is no stranger to community. As a lead for the Cloudify open source community at GigaSpaces, and an organizer of many local events including OpenStack Israel, DevOps Days Tel Aviv, and the DevOps Israel meetup group, she knows well what it means to be involved with bringing people together for common goals across open source projects.
As reported here earlier this week, Hortonworks' Hadoop Summit has been underway in Dublin, Ireland, and one of the biggest pieces of news there was that Pivotal, already a player in the Hadoop distribution arena, will be reselling Hortonworks Data Platform (HDP), Hortonworks' Hadoop platform. A corollary piece of news is that Pivotal is also shifting its focus from its own distribution to the Hortonworks platform.
Hortonworks this week announced a series of enterprise security efforts to bolster performance and data safety for its Hortonworks Data Platform.
The company announced Tuesday that Pivotal Software will standardize on Hortonworks' Hadoop distribution. Hortonworks also will resell extract, transform and load tools developed by Syncsort.
The thrust of Hortonworks' product announcements, made in conjunction with its Hadoop Summit, concerned updates on applying security policies and maintaining data governance to simplify the provisioning of clusters in hybrid clouds. Those procedures were designed to make it easier for customers to interactively explore data in Hadoop.
A local government digital service standard has been agreed and published after taking into account the views of council staff in a consultation last month.
The standard is a common approach for local authorities to deliver good-quality, user-centred, value-for-money digital services - and is a local government version of the original Government Digital Service Standard used across central government.
The New Zealand Government Open Access and Licensing (NZGOAL) framework is being extended to incorporate software licensing. The draft below is an initial version on which we are seeking feedback. The intention of this extension to NZGOAL is to ensure that publicly funded bespoke software is appropriately licensed to enable reuse by the public as well as government. This should enable more efficient maintenance and improvement, and potentially accelerate innovation going forward.