Linux.com

News For Open Source Professionals

Linux Foundation Public Health Joins The Fight Against COVID-19 Pandemic

Monday 25th of January 2021 08:10:31 PM

Brian Behlendorf is one of the most respected luminaries of the open-source world. He has been heading the Linux Foundation’s Hyperledger project since its inception and recently took over additional responsibilities for Linux Foundation Public Health. With the new administration sworn in, there will be an increased focus on science-backed public health efforts, and the foundation is well positioned to help the public sector with the strategic availability of open-source technologies to tackle serious health crises. In this interview, we dive deep into the scope of the Linux Foundation Public Health project. It’s going to look beyond the COVID-19 pandemic and address many other public health issues that we may see due to climate change and so on.



The post Linux Foundation Public Health Joins The Fight Against COVID-19 Pandemic appeared first on Linux.com.

The Maple Tree, A Modern Data Structure for a Complex Problem

Thursday 21st of January 2021 02:00:00 AM

In recent years, processors have experienced growth in core counts, which has pushed software to become multi-threaded and increased contention on the virtual memory data structures. The memory management subsystem uses the mmap_sem lock for write protection of the VMAs. Converting the mmap_sem lock into a rw-semaphore helped contention but did not solve the underlying issue. Even with a single-threaded program and a well-intentioned system admin, contention does arise through proc file accesses for application monitoring.

In this blog, we introduce a new data structure that can track gaps, store ranges, and be implemented in an RCU compatible manner. This is the Maple Tree.
Click to Read More at Oracle Linux Kernel Development

The post The Maple Tree, A Modern Data Structure for a Complex Problem appeared first on Linux.com.

Announcing the Unbreakable Enterprise Kernel Release 5 Update 4 for Oracle Linux

Wednesday 20th of January 2021 10:18:23 PM

A summary of what’s new in Unbreakable Enterprise Kernel Release 5 Update 4.
Click to Read More at Oracle Linux Kernel Development

The post Announcing the Unbreakable Enterprise Kernel Release 5 Update 4 for Oracle Linux appeared first on Linux.com.

NVMe over TCP

Wednesday 20th of January 2021 10:18:22 PM

This post describes how to set up Oracle Linux for NVMe over Fabrics to use a standard Ethernet network without having to purchase special RDMA-capable network hardware.
Click to Read More at Oracle Linux Kernel Development

The post NVMe over TCP appeared first on Linux.com.

An inside look at CVE-2020-10713, a.k.a. the GRUB2 “BootHole”

Wednesday 20th of January 2021 10:18:22 PM

The inside story of how CVE-2020-10713, a.k.a. the GRUB2 “BootHole” vulnerability, was reported and resolved.
Click to Read More at Oracle Linux Kernel Development

The post An inside look at CVE-2020-10713, a.k.a. the GRUB2 “BootHole” appeared first on Linux.com.

Extracting kernel stack function arguments from Linux x86-64 kernel crash dumps

Wednesday 20th of January 2021 10:18:21 PM

This blog post covers in detail how to extract stack function arguments from kernel crash dumps.
Click to Read More at Oracle Linux Kernel Development

The post Extracting kernel stack function arguments from Linux x86-64 kernel crash dumps appeared first on Linux.com.

Migrate NFS to GlusterFS and nfs-ganesha

Wednesday 20th of January 2021 10:18:20 PM

This article covers how to migrate an NFS server from kernel space to a userspace implementation based on GlusterFS and nfs-ganesha.
Click to Read More at Oracle Linux Kernel Development

The post Migrate NFS to GlusterFS and nfs-ganesha appeared first on Linux.com.

struct page, the Linux physical page frame data structure

Wednesday 20th of January 2021 10:18:19 PM

Gain insight into the Linux physical page frame data structure struct page and how to safely use various fields in the structure.
Click to Read More at Oracle Linux Kernel Development

The post struct page, the Linux physical page frame data structure appeared first on Linux.com.

Check out the Oracle talks at KVM Forum 2020

Wednesday 20th of January 2021 10:18:18 PM

The annual KVM Forum conference is next week. It brings together the world’s leading experts on Linux virtualization technology to present their latest work. The conference is virtual this year, with live attendance from October 28-30; check out the recordings once they are available at https://events.linuxfoundation.org/kvm-forum. We have a good number of engineers from the Oracle Linux kernel development team who will be presenting their work…
Click to Read More at Oracle Linux Kernel Development

The post Check out the Oracle talks at KVM Forum 2020 appeared first on Linux.com.

Multithreaded Struct Page Initialization

Wednesday 20th of January 2021 10:18:17 PM

Oracle Linux kernel developer Daniel Jordan contributes this post on the initial support for multithreaded jobs in padata. The last padata blog described unbinding padata jobs from specific CPUs. This post will cover padata’s initial support for multithreading CPU-intensive kernel paths, which takes us to the memory management system. The Bottleneck: During boot, the kernel needs to…
Click to Read More at Oracle Linux Kernel Development

The post Multithreaded Struct Page Initialization appeared first on Linux.com.

QEMU Live Update

Wednesday 20th of January 2021 10:18:16 PM

In this blog, Oracle Linux kernel engineers Steve Sistare and Mark Kanda present QEMU live update. The ability to update software with critical bug fixes and security mitigations while minimizing downtime is extremely important to customers and cloud service providers. In this blog post, we present QEMU Live Update, a new method for updating a running QEMU instance to a new…
Click to Read More at Oracle Linux Kernel Development

The post QEMU Live Update appeared first on Linux.com.

How to setup WireGuard on Oracle Linux

Wednesday 20th of January 2021 10:18:15 PM

Oracle Linux engineer William Kucharski provides an introduction to the VPN protocol WireGuard. WireGuard has received a lot of attention of late as a new, easier-to-use VPN mechanism, and it has now been added to Unbreakable Enterprise Kernel 6 Update 1 as a technology preview. But what is it, and how do I use it? What is…
Click to Read More at Oracle Linux Kernel Development

The post How to setup WireGuard on Oracle Linux appeared first on Linux.com.

Blacks In Technology and The Linux Foundation Partner to Offer up to $100,000 in Training & Certification to Deserving Individuals

Tuesday 19th of January 2021 03:27:13 PM

Program will provide verifiable, respected industry credentials to help promising individuals start an IT career

SAN FRANCISCO, January 19, 2021 – The Linux Foundation, the nonprofit organization enabling mass innovation through open source, and The Blacks In Technology Foundation, the largest community of Black technologists globally, today announced the launch of a new scholarship program to help more Black individuals get started with an IT career.

Blacks in Technology will award 50 scholarships per quarter to promising individuals. The Linux Foundation will provide each of these recipients with a voucher to register for any Linux Foundation administered certification exam at no charge, such as the Linux Foundation Certified IT Associate, Certified Kubernetes Administrator, Linux Foundation Certified System Administrator and more. Associated online training courses will also be provided at no cost when available for the exam selected. Each recipient will additionally receive one-on-one coaching with a Blacks In Technology mentor each month to help them stay on track in preparing for their exam. 

All Linux Foundation certification exams are conducted online with a proctor monitoring virtually via webcam and screen sharing. Scholarship recipients will have six months to sit for their exam, and should they fail to pass on the first attempt, one retake will be provided. Upon passing a certification exam, they will receive a PDF certificate and a digital badge which can be displayed on digital resumes and social media profiles, and which can be independently verified by potential employers. 

“We are extremely pleased to expand our partnership with Blacks in Technology to make quality open source education and certification more accessible to aspiring Black IT professionals,” said Linux Foundation SVP & GM of Training & Certification Clyde Seepersad. “While we have taken steps at The Linux Foundation to increase diversity in the open source community, there is a long way yet to go. There is so much potential talent out there, but without the resources and opportunities to nurture it, much will remain unfulfilled. We hope this program will help scholarship recipients start on the path to becoming successful IT professionals who can go on to mentor the next generation.”

“By removing the financial barrier to entry for our members, The Linux Foundation has empowered a new wave of diverse technical experts,” said Dennis Schultz, Executive Director of the Blacks In Technology Foundation. “By offering training and certification options for all experience levels, we can meet people where they are in their technical journey and provide support along the way for long term success.”

Those interested in applying for a Blacks in Technology/Linux Foundation scholarship can do so by visiting https://foundation.blacksintechnology.net/programs/

About Blacks in Technology

The Blacks In Technology Foundation is a 501(c)(3) non-profit and the largest global community of Black technologists with a combined membership and social media reach of over 50,000. Membership in Blacks In Technology is free. The Blacks In Technology (BIT) Foundation’s goal and mission is to “stomp the divide” between Black workers and the rest of the tech industry and to fundamentally influence and effect change. BIT intends to level the playing field through training, education, networking, and mentorship with the support of allies, partners, sponsors, and members. For more information please visit blacksintechnology.net

About the Linux Foundation

Founded in 2000, the Linux Foundation is supported by more than 1,000 members and is the world’s leading home for collaboration on open source software, open standards, open data, and open hardware. The Linux Foundation’s projects are critical to the world’s infrastructure, including Linux, Kubernetes, Node.js, and more. The Linux Foundation’s methodology focuses on leveraging best practices and addressing the needs of contributors, users and solution providers to create sustainable models for open collaboration. For more information, please visit us at linuxfoundation.org.

The Linux Foundation has registered trademarks and uses trademarks. For a list of trademarks of The Linux Foundation, please see its trademark usage page: www.linuxfoundation.org/trademark-usage. Linux is a registered trademark of Linus Torvalds.

# # #

The post Blacks In Technology and The Linux Foundation Partner to Offer up to $100,000 in Training & Certification to Deserving Individuals appeared first on Linux.com.

Review of Container-to-Container Communications in Kubernetes

Friday 15th of January 2021 04:15:34 PM

This article was originally posted at TheNewStack.

By Matt Zand and Jim Sullivan

Kubernetes is a container orchestration solution. It provides virtualized runtime environments called Pods, which house one or more containers. An important aspect of Kubernetes is container communication within the Pod. Another important area of managing the Kubernetes network is forwarding container ports internally and externally to make sure containers within a Pod can communicate with one another properly. To manage such communications, Kubernetes offers the following four networking models:

  • Container-to-Container communications
  • Pod-to-Pod communications
  • Pod-to-Service communications
  • External-to-Internal communications

In this article, we dive into Container-to-Container communications, by showing you ways in which containers within a pod can network and communicate.

Communication Between Containers in a Pod

Having multiple containers in a single Pod makes it relatively straightforward for them to communicate with each other. They can do this using several different methods. In this article, we discuss two methods in more detail: i- Shared Volumes and ii- Inter-Process Communications.

I- Shared Volumes in a Kubernetes Pod

In Kubernetes, you can use a shared Kubernetes Volume as a simple and efficient way to share data between containers in a Pod. In most cases, it is sufficient to use a directory on the host that is shared with all containers within a Pod.

A Kubernetes Volume enables data to survive container restarts, but it has the same lifetime as the Pod. This means that the volume (and the data it holds) exists exactly as long as that Pod exists. If that Pod is deleted for any reason, even if an identical replacement is created, the shared Volume is also destroyed and created from scratch.

A standard use case for a multicontainer Pod with a shared Volume is when one container writes logs or other files to the shared directory, and the other container reads from the shared directory. For example, we can create a Pod like so:
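A minimal Pod manifest matching this example (metadata and container names such as mc-pod, web, and writer are illustrative) might look like:

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: mc-pod
spec:
  volumes:
    # the shared Volume, initially empty, living as long as the Pod runs
    - name: html
      emptyDir: {}
  containers:
    # first container: serves the shared directory over HTTP
    - name: web
      image: nginx
      volumeMounts:
        - name: html
          mountPath: /usr/share/nginx/html
    # second container: appends the date and time every second
    - name: writer
      image: debian
      command: ["/bin/sh", "-c"]
      args: ["while true; do date >> /html/index.html; sleep 1; done"]
      volumeMounts:
        - name: html
          mountPath: /html
```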

In this example, we define a volume named html. Its type is emptyDir, which means that the Volume is first created when a Pod is assigned to a node and exists as long as that Pod is running on that node; as the name says, it is initially empty. The first container runs the Nginx server and has the shared Volume mounted to the directory /usr/share/nginx/html. The second container uses the Debian image and has the shared Volume mounted to the directory /html. Every second, the second container adds the current date and time into the index.html file, which is located in the shared Volume. When the user makes an HTTP request to the Pod, the Nginx server reads this file and transfers it back to the user in response to the request. Here is a good article for reading more on similar Kubernetes topics.

You can check that the pod is working either by exposing the nginx port and accessing it using your browser, or by checking the shared directory directly in the containers:
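With illustrative pod and container names (mc-pod, web, writer), and assuming a running cluster, such a check might look like:

```shell
# expose the nginx port locally and fetch the generated page
kubectl port-forward pod/mc-pod 8080:80 &
curl http://localhost:8080/

# or read the shared file from each container directly
kubectl exec mc-pod -c web -- cat /usr/share/nginx/html/index.html
kubectl exec mc-pod -c writer -- cat /html/index.html
```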

II- Inter-Process Communications (IPC)

Containers in a Pod share the same IPC namespace, which means they can also communicate with each other using standard inter-process communications such as SystemV semaphores or POSIX shared memory. Because containers in a Pod additionally share the same network namespace, they can reach one another via the localhost hostname for communication within a pod.

In the following example, we define a Pod with two containers. We use the same Docker image for both. The first container is a producer that creates a standard Linux message queue, writes a number of random messages, and then writes a special exit message. The second container is a consumer which opens that same message queue for reading and reads messages until it receives the exit message. We also set the restart policy to “Never”, so the Pod stops after the termination of both containers.
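A sketch of such a Pod definition follows; the image name ipc-example and its commands are hypothetical stand-ins for the article’s producer and consumer programs:

```yaml
apiVersion: v1
kind: Pod
metadata:
  name: mc-ipc
spec:
  # stop the Pod once both containers terminate
  restartPolicy: Never
  containers:
    - name: producer
      image: ipc-example        # hypothetical image with the queue programs
      command: ["./ipc", "-producer"]
    - name: consumer
      image: ipc-example
      command: ["./ipc", "-consumer"]
```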

To check this out, create the pod using kubectl create and watch the Pod status:
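Assuming the Pod definition was saved as mc-ipc.yaml (an illustrative filename), that amounts to:

```shell
kubectl create -f mc-ipc.yaml
kubectl get pod mc-ipc --watch
```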

Now you can check logs for each container and verify that the second container received all messages from the first container, including the exit message:
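With the illustrative names used here, that is:

```shell
kubectl logs mc-ipc -c producer
kubectl logs mc-ipc -c consumer
```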

There is one major problem with this Pod, however, and it has to do with how containers start up: Kubernetes does not coordinate container startup order within a Pod, so the consumer could try to open the message queue before the producer has created it.

Conclusion

The primary reason that Pods can have multiple containers is to support helper applications that assist a primary application. Typical examples of helper applications are data pullers, data pushers and proxies. An example of this pattern is a web server with a helper program that polls a git repository for new updates.

The Volume in this exercise provides a way for containers to communicate during the life of the Pod. If the Pod is deleted and recreated, any data stored in the shared Volume is lost. In this article, we also discussed the concept of Inter-Process Communications among containers within a Pod, which is an alternative to the shared Volume approach. Now that you have learned how containers inside a Pod can communicate and exchange data, you can move on to learning other Kubernetes networking models, such as Pod-to-Pod or Pod-to-Service communications. Here is a good article for learning more advanced topics on Kubernetes development.

About the Authors

Matt Zand: Matt is a serial entrepreneur and the founder of three successful tech startups: DC Web Makers, Coding Bootcamps and High School Technology Services. He is a leading author of the Hands-on Smart Contract Development with Hyperledger Fabric book by O’Reilly Media.

Jim Sullivan: Jim has a bachelor’s degree in Electrical Engineering, a master’s degree in Computer Science, and an MBA. Jim has been a practicing software engineer for 18 years. Currently, Jim leads an expert team in blockchain development, DevOps, cloud, application development, and the SAFe Agile methodology. Jim is an IBM Master Instructor.

The post Review of Container-to-Container Communications in Kubernetes appeared first on Linux.com.

How to Create and Manage Archive Files in Linux

Friday 15th of January 2021 04:15:33 PM

By Matt Zand and Kevin Downs

In a nutshell, an archive is a single file that contains a collection of other files and/or directories. Archive files are typically used to transfer files (locally or over the internet) or to make a backup copy of a collection of files and directories, allowing you to work with a single file (which, if compressed, is smaller than the combined size of the files within it) instead of many. Likewise, archives are used for software application packaging. This single file can be easily compressed for ease of transfer, while the files in the archive retain the structure and permissions of the original files.

We can use the tar tool to create, list, and extract files from archives. Archives made with tar are normally called “tar files,” “tar archives,” or—since all the archived files are rolled into one—“tarballs.”

This tutorial shows how to use tar to create an archive, list the contents of an archive, and extract the files from an archive. Two common options used with all three of these operations are ‘-f’ and ‘-v’: to specify the name of the archive file, use ‘-f’ followed by the file name; use the ‘-v’ (“verbose”) option to have tar output the names of files as they are processed. While the ‘-v’ option is not necessary, it lets you observe the progress of your tar operation.

For the remainder of this tutorial, we cover 3 topics: 1- Create an archive file, 2- List contents of an archive file, and 3- Extract contents from an archive file. We conclude this tutorial by surveying 6 practical questions related to archive file management. What you take away from this tutorial is essential for performing tasks related to cybersecurity and cloud technology.

1- Creating an Archive File

To create an archive with tar, use the ‘-c’ (“create”) option, and specify the name of the archive file to create with the ‘-f’ option. It’s common practice to use a name with a ‘.tar’ extension, such as ‘my-backup.tar’. Note that unless specifically mentioned otherwise, all commands and command parameters used in the remainder of this article are used in lowercase. Keep in mind that while typing commands in this article on your terminal, you need not type the $ prompt sign that comes at the beginning of each command line.

Give as arguments the names of the files to be archived; to create an archive of a directory and all of the files and subdirectories it contains, give the directory’s name as an argument.

 To create an archive called ‘project.tar’ from the contents of the ‘project’ directory, type:

$ tar -cvf project.tar project

This command creates an archive file called ‘project.tar’ containing the ‘project’ directory and all of its contents. The original ‘project’ directory remains unchanged.

Use the ‘-z’ option to compress the archive as it is being written. This yields the same output as creating an uncompressed archive and then using gzip to compress it, but it eliminates the extra step.

 To create a compressed archive called ‘project.tar.gz’ from the contents of the ‘project’ directory, type:

$ tar -zcvf project.tar.gz project

This command creates a compressed archive file, ‘project.tar.gz’, containing the ‘project’ directory and all of its contents. The original ‘project’ directory remains unchanged.

NOTE: While using the ‘-z’ option, you should specify the archive name with a ‘.tar.gz’ extension and not a ‘.tar’ extension, so the file name shows that the archive is compressed. Although not required, it is a good practice to follow.

Gzip is not the only form of compression. There are also bzip2 and xz. When we see a file with an extension of .xz, we know it has been compressed using xz. When we see a file with the extension .bz2, we can infer it was compressed using bzip2. We are going to steer away from bzip2, as it is becoming unmaintained, and focus on xz. Compressing with xz takes longer. However, it is typically worth the wait, as the compression is much more effective, meaning the resulting file will usually be smaller than with other compression methods. Even better, decompression time does not differ much between the methods. Below is an example of how to utilize xz when compressing a file using tar:

  $ tar -Jcvf project.tar.xz project

We simply switch the -z flag used for gzip to the uppercase -J flag for xz. Here are some outputs to display the differences between the forms of compression:
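Such a comparison can be reproduced like this (a sketch; it assumes both gzip and xz are installed, and the directory name is illustrative):

```shell
# create some compressible sample data
mkdir -p project
seq 1 100000 > project/data.txt

# compress the same directory with gzip and with xz
tar -zcvf project.tar.gz project
tar -Jcvf project.tar.xz project

# compare the resulting sizes (the xz file should be smaller)
ls -l project.tar.gz project.tar.xz
```

Prefixing each tar command with time will also show the speed difference.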

As you can see, xz takes the longest to compress. However, it does the best job of reducing file size, so it’s worth the wait. The larger the file, the better the compression becomes, too!

2- Listing Contents of an Archive File

To list the contents of a tar archive without extracting them, use tar with the ‘-t’ option.

 To list the contents of an archive called ‘project.tar’, type:

$ tar -tvf project.tar  

This command lists the contents of the ‘project.tar’ archive. Using the ‘-v’ option along with the ‘-t’ option causes tar to output the permissions and modification time of each file, along with its file name—the same format used by the ls command with the ‘-l’ option.

 To list the contents of a compressed archive called ‘project.tar.gz’, type:

$ tar -ztvf project.tar.gz

 3- Extracting contents from an Archive File

To extract (or unpack) the contents of a tar archive, use tar with the ‘-x’ (“extract”) option.

 To extract the contents of an archive called ‘project.tar’, type:

$ tar -xvf project.tar

This command extracts the contents of the ‘project.tar’ archive into the current directory.

If an archive is compressed, which usually means it will have a ‘.tar.gz’ or ‘.tgz’ extension, include the ‘-z’ option.

 To extract the contents of a compressed archive called ‘project.tar.gz’, type:

$ tar -zxvf project.tar.gz

NOTE: If there are files or subdirectories in the current directory with the same name as any of those in the archive, those files will be overwritten when the archive is extracted. If you don’t know what files are included in an archive, consider listing the contents of the archive first.

Another reason to list the contents of an archive before extracting them is to determine whether the files in the archive are contained in a directory. If not, and the current directory contains many unrelated files, you might confuse them with the files extracted from the archive.

To extract the files into a directory of their own, make a new directory, move the archive to that directory, and change to that directory, where you can then extract the files from the archive.
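Put together, those steps look like this (directory and archive names are illustrative):

```shell
# setup for the example: a small archive in the current directory
mkdir -p project && echo hello > project/readme.txt
tar -cvf project.tar project

# extract into a directory of its own
mkdir extracted
mv project.tar extracted
cd extracted
tar -xvf project.tar
ls project
```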

Now that we have learned how to create an archive file and list/extract its contents, we can move on to discuss the following 6 practical questions that are frequently asked by Linux professionals.

  • Can we add content to an archive file without unpacking it?

Unfortunately, once a file has been compressed there is no way to add content to it. You would have to “unpack” it or extract the contents, edit or add content, and then compress the file again. If it’s a small file this process will not take long. If it’s a larger file then be prepared for it to take a while.
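Note that only the compression layer has to be undone, not the archive itself: GNU tar’s -r (append) flag can add members to an uncompressed archive. A sketch, with illustrative names, assuming a gzip-compressed archive:

```shell
# setup for the example: a compressed archive containing one file
mkdir -p demo && echo one > demo/file1
tar -zcvf archive.tar.gz demo

# decompress the archive (leaves archive.tar), append, recompress
gzip -d archive.tar.gz
echo two > demo/file2
tar -rvf archive.tar demo/file2
gzip archive.tar

# the new member is now listed
tar -ztvf archive.tar.gz
```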

  • Can we delete content from an archive file without unpacking it?

This depends on the version of tar being used. Newer versions of GNU tar support a --delete option (note that it works only on uncompressed archives).

For example, let’s say we have files file1 and file2 . They can be removed from file.tar with the following:

$ tar -vf file.tar --delete file1 file2

To remove a directory dir1:

$ tar -f file.tar --delete --wildcards 'dir1/*'

  • What are the differences between compressing a folder and archiving it?

The simplest way to look at the difference between archiving and compressing is to look at the end result. When you archive files, you are combining multiple files into one. So if we archive ten 100KB files, we will end up with one 1000KB file. On the other hand, if we compress those files, we could end up with a file that is only a few KB or close to 100KB in size.

  • How to compress archive files?

As we saw above, you can create archive files using the tar command with the cvf options. To compress the archive file we made, there are two options: run the archive file through a compression utility such as gzip, or use a compression flag when using the tar command. The most common compression flags are: -z for gzip, -j for bzip2, and -J for xz. We can see the first method below:

$ gzip file.tar

Or we can just use a compression flag when using the tar command, here we’ll see the gzip flag “z”:

$ tar -cvzf file.tar.gz /some/directory

  • How to create archives of multiple directories and/or files at one time?

It is not uncommon to be in situations where we want to archive multiple files or directories at once. And it’s not as difficult as you think to tar multiple files and directories at one time. You simply supply which files or directories you want to tar as arguments to the tar command:

$ tar -cvzf file.tar.gz file1 file2 file3

or

$ tar -cvzf file.tar.gz /some/directory1 /some/directory2

  • How to skip directories and/or files when creating an archive?

You may run into a situation where you want to archive a directory or file but you don’t need certain files to be archived. To avoid archiving those files, or to “exclude” them, you would use the --exclude option with tar:

$ tar --exclude '/some/directory' -cvf file.tar /home/user

So in this example, /home/user would be archived, but /some/directory would be excluded if it was under /home/user. It’s important to put the --exclude option before the source and destination, and to enclose the file or directory being excluded in single quotation marks.

Summary

The tar command is useful for creating backups or compressing files you no longer need. It’s good practice to back up files before changing them; if something doesn’t work as intended after the change, you can always revert to the old file. Compressing files no longer in use helps keep systems clean and lowers disk space usage. Other utilities are available, but tar has reigned supreme for its versatility, ease of use and popularity.

Resources

If you would like to learn more about Linux, reading the following articles and tutorials is highly recommended:

About the Authors

Matt Zand is a serial entrepreneur and the founder of 3 tech startups: DC Web Makers, Coding Bootcamps and High School Technology Services. He is a leading author of the Hands-on Smart Contract Development with Hyperledger Fabric book by O’Reilly Media. He has written more than 100 technical articles and tutorials on blockchain development for the Hyperledger, Ethereum and Corda R3 platforms. At DC Web Makers, he leads a team of blockchain experts for consulting and deploying enterprise decentralized applications. As chief architect, he has designed and developed blockchain courses and training programs for Coding Bootcamps. He has a master’s degree in business management from the University of Maryland. Prior to blockchain development and consulting, he worked as a senior web and mobile app developer and consultant, angel investor, and business advisor for a few startup companies. You can connect with him on LI: https://www.linkedin.com/in/matt-zand-64047871

Kevin Downs is a Red Hat Certified System Administrator (RHCSA). At his current job at IBM as a systems administrator, he is in charge of administering hundreds of servers running different Linux distributions. He is a Lead Linux Instructor at Coding Bootcamps, where he has authored 5 self-paced courses.

The post How to Create and Manage Archive Files in Linux appeared first on Linux.com.

Prepr Partners with the Linux Foundation to Provide Digital Work-Integrated Learning through the F.U.N.™ Program

Friday 15th of January 2021 04:15:31 PM

December 14th, 2020 – Toronto, Canada – Prepr is excited to announce a new partnership with The Linux Foundation, the nonprofit organization enabling mass innovation through open source, that will give work-integrated learning experiences to youth facing employment barriers. The new initiative, the Flexible Upskilling Network (F.U.N.) program, launches in collaboration with the Magnet Network and the Network for the Advancement of Black Communities (NABC). The F.U.N. program is a blended learning program, where participants receive opportunities to combine valuable work experience with digital skill development over a 16-week journey. The objective of the F.U.N. program is to support youth, with a focus on women and visible minority groups who are involuntarily not in employment, education, or training (NEET) in Ontario, by helping them gain employability skills, including soft skills like communication, collaboration, and problem-solving.

Caitlin McDonough, Chief Education Officer at Prepr, says about the F.U.N. program, “Digital skills are essential for the workforce of the future. We at Prepr, are looking forward to the opportunity to support youth capacity development for the future of work.”

With The Linux Foundation, Prepr is committed to supporting over 180 youth participants enrolling in and completing the F.U.N. program between July 2020 and March 2021. Prepr will be using its signature PIE® method to train the participants in Project Leadership, Innovation, and Entrepreneurship and to expose them to real-world business challenges. The work-integrated learning experience Prepr provides will support participants in developing both soft and hard skills, with a focus on digital skills, to help them secure gainful employment for the uncertain future of work.

“In this day and age, it is essential to have a good educational foundation in technology to maximize your chances of career success,” said Clyde Seepersad, SVP and GM, Training & Certification at The Linux Foundation. “We are thrilled to partner with Prepr to bring The Linux Foundation’s vendor-neutral, expert training in the open source technologies that serve as the backbone of modern technologies to communities that will truly benefit from it. I look forward to seeing how these promising students perform and hope to partner with Prepr on future initiatives to train even more in the future.”

The program will explore digital career pathways through multiple work-related challenges. These work challenges will bring creative approaches to gaining innovative skills that are invaluable in today’s new normal of remote work and learning, while allowing individuals to become more competitive in the digital workforce.

Stephen Crawford, MPP for Oakville, spoke about the government’s commitment to supporting youth facing employment barriers: “This government is committed to supporting our youth, notably visible minorities, as they prepare to enter the workforce. The youth of today will be the leaders of tomorrow.” The Ontario government funding for the F.U.N. program is part of a $37 million investment in training initiatives across the province.

Through the program’s blended learning approach, participants will learn how to use Prepr’s signature PIE® tool, which addresses three essential skills gaps facing the business services sector today: expertise in innovation, project management, and business development (entrepreneurship, sales, and commercialization). At the end of the program, participants will earn a certification, along with 12 weeks of hands-on work experience, which will foster valuable, future-proof skills to secure gainful employment.

The Linux Foundation will also support participants through an introductory course on Linux and related tools: LFS101x: Introduction to Linux. The program will help develop the digital skills essential for our new normal of work, with beginner-level challenges to fill obvious skills gaps and foster a mentality of problem-solving. With the support of open Linux Foundation resources, these challenges will be an opportunity for participants to ideate and create project solutions ready for real-world implementation.

About Prepr

Prepr provides the tools, resources, and technology to empower individuals to become lifelong problem solvers. Through triangular cooperation between the public and private sectors as well as government, Prepr aims to strengthen the collaboration on challenges that affect individuals, communities, businesses, and infrastructure to create a more sustainable future for everyone.

About The Linux Foundation

Founded in 2000, the Linux Foundation is supported by more than 1,000 members and is the world’s leading home for collaboration on open source software, open standards, open data, and open hardware. The Linux Foundation’s projects are critical to the world’s infrastructure, including Linux, Kubernetes, Node.js, and more. The Linux Foundation’s methodology focuses on leveraging best practices and addressing the needs of contributors, users, and solution providers to create sustainable models for open collaboration. For more information, please visit us at linuxfoundation.org.

The Linux Foundation has registered trademarks and uses trademarks. For a list of trademarks of The Linux Foundation, please see its trademark usage page: www.linuxfoundation.org/trademark-usage. Linux is a registered trademark of Linus Torvalds.

The post Prepr Partners with the Linux Foundation to Provide Digital Work-Integrated Learning through the F.U.N.™ Program appeared first on Linux.com.

Open Source Jobs Remain Secure During COVID-19 Pandemic and More Findings From Linux Foundation and Laboratory for Innovation Science at Harvard Report

Friday 15th of January 2021 04:15:30 PM

A new report from The Linux Foundation and the Laboratory for Innovation Science at Harvard (LISH) has found that 56% of survey respondents reported that involvement in open source projects was important in getting their current job, and 55% feel that participating in open source projects has increased their salary or otherwise improved their job prospects. The “Report on the 2020 FOSS Contributor Survey” compiled the answers of 1,196 contributors to free and open source software (FOSS), and also found that 81% stated the skills and knowledge gained by working on open source were valuable to their employer.

One highlight of the report was the finding that, “[d]espite the survey being administered during the economic downturn resulting from the COVID-19 pandemic, very few respondents were out of the workforce.” This aligns with our 2020 Open Source Jobs Report from earlier this year, in which only 4% of hiring managers reported they have laid off open source professionals due to the pandemic, and a further 2% furloughed open source staff.

In terms of why these individuals contribute to open source projects, respondents were, unsurprisingly, most likely to say that they use open source software and need certain features added, so they build and add those features themselves. The next top answers provided more insight into what motivates these open source professionals: “I enjoy learning” and “Contributing allows me to fulfill a need for creative, challenging, and/or enjoyable work”. This also aligns with the recent jobs report, in which open source pros reported they decided to work in the open source community because “Open source runs everything” and “I am passionate about open source”. Both reports suggested that compensation, while important, is not a dominant source of motivation.

Focusing more on what open source projects can do to be successful, the new report goes on to suggest that, “FOSS projects could also provide some educational materials (such as tutorials or getting started guides) about their projects to help those motivated by a desire to learn.” This gets to the heart of our mission at LF Training & Certification – to make quality training materials about open source technologies accessible to everyone. 

One area of opportunity for projects, employers, and open source pros, according to the report, is secure development practices. The survey respondents overwhelmingly reported that they spend little time focusing on security issues, despite both the quantity and sophistication of attacks increasing year after year. The report goes on to suggest that “a free online course on how to develop secure software as a desirable contribution from an external source” may help. LF Training & Certification recently released just such a training program in the form of our Secure Software Development Fundamentals Professional Certificate program, created in partnership with the Open Source Security Foundation and hosted by the non-profit learning platform edX. The program consists of three courses, all of which can be audited for free; those who wish to obtain the Professional Certificate may do so by paying a fee and passing a series of tests aligned to each course. Employers concerned about software development security issues should consider mandating that staff take training like this, and projects should consider requiring it of maintainers as well.

This is just the tip of the iceberg in terms of the findings of the FOSS Contributor Survey; we encourage you to download and review the full document for even more insights and recommendations.

The post Open Source Jobs Remain Secure During COVID-19 Pandemic and More Findings From Linux Foundation and Laboratory for Innovation Science at Harvard Report appeared first on Linux.com.

Tips for Starting Your New IT Career in 2021!

Friday 15th of January 2021 04:15:29 PM

2020 was a difficult year for all of us, and for many the challenges continue in 2021. Jobs have been lost, and whole industries have been forced to revamp their entire business models, leaving many out of work or facing new ways of working. While significant challenges remain, think of this as an opportunity to consider a new career in the new year.

Pick the right path for you

The first thing to consider when looking at moving into an IT career is deciding which area of IT to pursue. The 2020 Open Source Jobs Report found the most in-demand position to be DevOps practitioner, followed by developer. The top areas of expertise being sought by hiring managers are Linux, cloud, and security. While it’s good to consider what skills are in demand, it’s just as important to figure out which subject areas will interest you most. If you find a role that not only offers great career opportunities but that you will also enjoy, you are that much more likely to be successful. Our Career Path Quiz is a great place to start, and can point you in the direction of a technology focus that aligns with your existing interests.

Start with free training to ensure there’s a fit

Before jumping headfirst into a training and/or certification program, take advantage of free training courses to gain baseline knowledge and to ensure this path is really one you want to pursue. Our Plan Your Training page outlines suggested courses and certifications depending on the subject area you’ve chosen. Many paths, including System Administration, Cloud & Containers, and DevOps & Site Reliability Engineering, start with LFS101 – Introduction to Linux, which is a good starting point for just about anyone looking to start an IT career. Other popular free courses include LFS151 – Introduction to Cloud Infrastructure Technologies, LFS158 – Introduction to Kubernetes, and LFS162 – Introduction to DevOps & Site Reliability Engineering.

Begin learning about intermediate and advanced topics

Once you’ve selected a path and taken some free courses to confirm it’s right for you, it’s now time to move into intermediate and advanced training courses. The Plan Your Training page is still a great resource as it lists the courses that will be most beneficial to learn about a particular topic area. Keep in mind that you typically will not need to complete every single course in a given area to be ready to begin working; concentrate on ensuring that you have the basic skills needed and you can always come back later in your career to pursue more advanced courses.

Think about certifications

While planning the training courses you wish to complete, keep certifications top of mind as well. Especially for those who are new to IT and do not have past experience to fall back on, holding a certification gives potential employers confidence that you have the skills needed to succeed in a given role. Many Linux Foundation training courses complement and help prepare for specific certification exams, so work both into your learning plan. We offer certifications for those just starting out, like the Linux Foundation Certified IT Associate (LFCA), in addition to more specialized certifications like the Certified Kubernetes Administrator (CKA). Be sure to take advantage of the digital badges awarded for successfully completing a certification, which can be linked to social media profiles like LinkedIn and can be independently verified, giving employers confidence in your skills. The Open Source Jobs Report also found that a majority of hiring managers give preference to certified candidates, so these certifications really can open doors.

More structured options

For those who want a bit more structure and support in achieving their learning goals, we also offer two bootcamps. If you’re just getting started and are interested in pursuing a cloud career, the Cloud Engineer Bootcamp meets all your training and certification needs in one organized package. One major benefit of the bootcamps is that they include instructor office hours five days per week, enabling you to speak with one of our expert instructors to get answers to your questions and tips on how to be most successful.

As we move forward into 2021, countless new career opportunities will be available for those who take the steps to pursue them. Get started today and enroll in training to gain the skills you need to be successful in an IT career, then take those skills and gain the certification to prove it!

The post Tips for Starting Your New IT Career in 2021! appeared first on Linux.com.

New, Free Training Course Covering the Basics of WebAssembly Now Available

Friday 15th of January 2021 04:15:28 PM

Introduction to WebAssembly is the newest training course from The Linux Foundation! This course, offered on the non-profit edX learning platform, can be audited by anyone at no cost. The course is designed for web, Dweb, cloud, and blockchain developers, architects, and CTOs interested in learning about the strengths and limitations of WebAssembly, the fourth “official” language of the web (alongside JavaScript, HTML, and CSS), and its potential applications in blockchain, serverless, edge/IoT, and more. WebAssembly has been rapidly growing in popularity thanks to its security, simplicity, and the lightweight nature of its runtime. It is also language-agnostic, making it a suitable compilation target for a wide range of modern languages.

The six-hour course uses video content, written material, and hands-on labs to delve into how WebAssembly runs ‘under the hood’ and how you can leverage its capabilities in and beyond the browser. It also explores a series of potential applications in different industries, and takes a quick peek at upcoming features. Enrollees will walk away from the course with an understanding of what the WebAssembly runtime is, and how it provides a secure, fast, and efficient compilation target for a wide range of modern programming languages, allowing them to target the browser and beyond.
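As a small taste of what “under the hood” means (this snippet is our own illustration, not material from the course), here is the classic minimal WebAssembly module, hand-assembled byte by byte, exporting a single `add` function and instantiated with the standard `WebAssembly` JavaScript API available in modern browsers and Node.js:

```javascript
// A minimal WebAssembly binary: one exported function, add(a, b) -> a + b.
const bytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d,                               // magic: "\0asm"
  0x01, 0x00, 0x00, 0x00,                               // binary format version 1
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f, // type section: (i32, i32) -> i32
  0x03, 0x02, 0x01, 0x00,                               // function section: func 0 uses type 0
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00, // export section: "add" = func 0
  0x0a, 0x09, 0x01, 0x07, 0x00,                         // code section: one body, no locals
  0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b                    // local.get 0, local.get 1, i32.add, end
]);

// Compile and instantiate synchronously (fine for tiny modules like this one).
const instance = new WebAssembly.Instance(new WebAssembly.Module(bytes));

console.log(instance.exports.add(2, 3)); // 5
```

In practice you would never write these bytes by hand; the same binary could be produced by compiling C, Rust, or any other supported language. The point is that the runtime consumes a compact, language-agnostic binary format, which is exactly what makes it a portable compilation target.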

The course was developed by Colin Eberhardt, the Technology Director at Scott Logic, a UK-based software consultancy which creates complex applications for financial services clients. Colin is an avid technology enthusiast, spending his evenings contributing to open source projects, writing blog posts and learning as much as he can.

“WebAssembly is one of the most exciting technologies I have come across for years,” said Eberhardt. “Its initial promise was a fast and efficient multi-language runtime for the web, but it has the potential to be so much more. We are already seeing this runtime being used for numerous applications beyond the browser, including serverless and blockchain, with more novel uses and applications appearing each week!”

The course is available for immediate enrollment. Those requiring a verified certificate of completion may upgrade their enrollment for $149. Start gaining skills in WebAssembly today!

The post New, Free Training Course Covering the Basics of WebAssembly Now Available appeared first on Linux.com.