Server: OpenShift and Reasons to Scale Horizontally

Filed under: Red Hat, Server
  • “The power of Kubernetes & OpenShift lies not only in the capabilities but also in the broad ecosystem of products”

    Last month, Red Hat announced the general availability of OpenShift Container Platform 3.11 – an important release because it incorporates the first wave of technology from the CoreOS acquisition. We talked to Diane Mueller, Red Hat’s director of Community Development for OpenShift, about the importance of this release, Red Hat’s plans to continue innovating both in and around Kubernetes and Operators, and more.

  • Exploring Stretch Clusters for Red Hat OpenShift Dedicated

    Red Hat OpenShift Dedicated has evolved as an effective way to consume OpenShift as a managed service in the public cloud. As we continue to collect feedback from customers, partners, and internal users, we’re excited to be able to present some substantial improvements to the offering, effective this month. I want to focus mainly on the new options available for new OpenShift Dedicated clusters, along with new features that are now available for all OpenShift Dedicated deployments.

  • Reasons to Scale Horizontally

    Scaling vertically is also known as “scaling up,” whereas scaling horizontally is known as “scaling out.” Vertical scaling means adding more resources to a single node in a system, while horizontal scaling means adding more nodes to the system.

More in Tux Machines

A brief comparison of Java IDEs: NetBeans vs. Eclipse

Thinking about entering the world of programming? What better way in than Java, with its community of over 10 million developers worldwide? Java is one of the most popular programming languages right now. It is an object-oriented language that compiles to portable bytecode and runs on the Java Virtual Machine, which is available on every major operating system, including Linux, Windows, and macOS. That portability means a program written on one platform can run on all platforms. Java supports networking (you can use TCP and UDP sockets) and can access remote data using a variety of protocols. It also provides multithreading, which can take advantage of multiple processors, and one of its prime features is garbage collection. In many languages the programmer is responsible for deallocating memory, which can become a hassle and lead to errors and segmentation faults. Java, on the other hand, has a garbage collector that manages memory and frees it by destroying objects that are no longer in use.

To start coding in Java you need to have Java installed. The latest version is Java 11, but Java 8 is still supported, so having either one installed is enough to get you started. Writing and compiling a program by hand takes some effort: you write the code in a text file, save it with a .java extension, and compile it from the terminal. Alternatively, you can use an IDE, save yourself that time and effort, and get a slew of interesting features in the bargain. An Integrated Development Environment, or IDE for short, is a software application that helps the user write and compile code easily by providing features such as text editing, debugging, and plugins, with compilation at the click of a button. Java has many IDEs, but two of the most popular are NetBeans and Eclipse.

Read more
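
To see what that manual "text file plus terminal" workflow looks like in practice, here is a minimal sketch; the file and class name Hello are just an illustration, not something taken from the article:

    // Hello.java - a minimal single-class program used only to illustrate
    // the manual workflow described above.
    //
    //   Compile:  javac Hello.java   (produces Hello.class bytecode)
    //   Run:      java Hello         (the JVM loads and executes the bytecode)
    public class Hello {
        public static void main(String[] args) {
            System.out.println("Hello from Java!");
        }
    }

An IDE such as NetBeans or Eclipse performs the same compile-and-run steps behind a single button, while layering text editing, debugging, and plugin support on top.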

Graphics: AMDGPU, Mesa and Intel

  • AMDGPU Has Late Fixes For Linux 5.0: Golden Register Update For Vega 20, Display Fixes
    There are some last-minute changes to the AMDGPU Direct Rendering Manager (DRM) driver for the upcoming Linux 5.0 kernel release. Being past RC7, it's quite late in the cycle, but some work has materialized that AMD is seeking to get in ahead of the stable release to improve the open-source Radeon GPU support.
  • Mesa 19.1 Panfrost Driver Gets Pantrace & Pandecode Support To Help Reverse Engineering
    Since being added to Mesa 19.1 at the start of this month, the Panfrost driver has continued to speed along in bringing up open-source graphics support for ARM Mali T600/T700/T860 GPUs. The latest batch of code was merged overnight, including support for some reverse-engineering helpers.
  • Intel's Shiny Vulkan Overlay Layer Lands In Mesa 19.1 - Provides A HUD With Driver Stats
    In more exciting open-source Intel Linux graphics news this week, beyond the new merge request to mainline the Iris Gallium3D driver, Intel has today merged its Vulkan overlay layer, which provides a heads-up display of sorts for the Linux "ANV" driver. Last month we reported on Intel developing a Vulkan "heads-up display" for their driver to show various statistics that help the driver developers themselves as well as application/game developers. It is akin to the Gallium HUD but suited to Vulkan usage rather than OpenGL.
  • Intel Iris Gallium3D Driver Merged To Mainline Mesa 19.1
    Well, that sure didn't take long... Less than 24 hours after the merge request to mainline the Intel "Iris" Gallium3D driver was sent out, it has been merged into the mainline code-base! The Intel Gallium3D driver is now in Mesa Git for easy testing of their next-generation OpenGL Linux driver. Making the day even more exciting for Intel Linux users, this driver's landing comes just minutes after the Vulkan overlay layer HUD was merged for Intel's ANV open-source driver.

today's howtos

Linux Foundation: Mobile World Congress 2019, LF Deep Learning Foundation and Calico/CNCF

  • MEDIA ADVISORY: The Linux Foundation to Participate in Mobile World Congress 2019
    The Linux Foundation, the nonprofit organization enabling mass innovation through open source, will be onsite at Mobile World Congress 2019, February 25-28, in Barcelona, Spain.
  • Ericsson Joins Linux Foundation Deep Learning Group As Premier Member
    The LF Deep Learning Foundation (LF DL), a Linux Foundation project that supports and sustains open source innovation in artificial intelligence (AI), machine learning (ML), and deep learning (DL), announces that Ericsson has become its newest Premier Member. Ericsson, a global leader in delivering ICT solutions, has been at the forefront of communications technology for 140 years. Ericsson has already begun contributing to the LF Deep Learning Foundation through the Acumos project, working with partners like AT&T, Orange, and the broader community to solve complex problems surrounding 5G and IoT through AI and ML. In addition to participating in LF DL, Ericsson is also a member of LF Networking, DPDK, the Cloud Native Computing Foundation, and the LF Edge Foundation. Ericsson is strongly committed to these future-forward technologies, and to that end the company has built a Global AI Accelerator focused on tackling the complex business problems of today and tomorrow.
  • The Calico cloud
    Calico, which is now a Cloud Native Computing Foundation (CNCF) project, can be used on many clouds. It supports common networking APIs such as the Container Network Interface (CNI), OpenStack Neutron, and libnetwork. Besides Kubernetes, it can also be used with Docker, Mesos, and Rkt. You can natively deploy Calico on Amazon Web Services (AWS), Google Compute Engine, and the IBM Cloud. You can’t use Calico directly on Azure, but you can use Calico policies with the right network setup. You can get started with Calico today. If you need help and support to get Calico into production, Tigera, Calico’s corporate backer, offers service level agreements (SLAs).