Some rant about desktops, evolution and everything..

I have asked myself numerous times before: what are we missing to have the best desktop? No matter whether it runs Windows, Mac, Linux, or anything else.

And the answer I have for myself is that we are limited by the fact that we already know what one should expect from an average desktop.

So, what do we know about it?

Well, the de-facto standard for a desktop includes an integrated environment, a standardized appearance across all applications, the ability to quickly launch applications and switch between them, and the possibility of working with many documents easily: different types of documents, graphical applications, resources located somewhere distant on the world-wide network. And all the other small things that have become so tightly integrated into our lives that we cannot imagine a computer without them.

And now, most desktop environments and desktop projects have it. We have it in Windows. We have it in Mac. We have it in Ubuntu, Fedora, SuSE, Mandriva (or better, in GNOME, KDE, Xfce and other desktop environments).

But if we consider the other side of the coin, I think that things are not that simple.

More in Tux Machines

Ubuntu 16.10 review: Convergence is in a holding pattern; consistency’s here instead

There's plenty in Ubuntu 16.10 that makes it worth the upgrade, though nothing about Canonical's latest release is groundbreaking. This less experimental but worthwhile update continues to refine and bug-fix what at this point has become the fastest, stablest, least-likely-to-completely-change-between-point-releases of the three major "modern" Linux desktops. Still, while the Unity 7.5 desktop offers stability and speed today, it's not long for this world. Ubuntu 16.10 is the seventh release since the fabled Unity 8 and its accompanying Mir display server were announced. Yet in Ubuntu 16.10, there's still no Unity 8 nor Mir. Read more

NVIDIA GeForce GTX 1050 OpenGL/Vulkan/OpenCL Linux Performance

Earlier this week NVIDIA began shipping the GeForce GTX 1050 graphics cards and our first review is of a Zotac GeForce GTX 1050 Mini. A GeForce GTX 1050 Ti Linux review is still coming up plus some other articles looking at performance-per-Watt and other interesting areas for these low-cost Pascal-based GPUs. Here are results of the latest NVIDIA Linux performance compared to the latest open-source AMD Linux driver with various Radeon GPUs. Read more

What you can learn from GitHub's top 10 open source projects

Open source dominates big data. So much so, in fact, that Cloudera co-founder Mike Olson has declared, "No dominant platform-level software infrastructure has emerged in the last ten years in closed-source, proprietary form." He's right, as the vast majority of our best big data infrastructure (Apache Hadoop, Apache Spark, MongoDB, etc.) is open source. Read more


  • Managing OpenStack with Open Source Tools
    Day 2 operations are still dominated by manual, custom scripts written by individual system administrators; enterprises need automation. Based on the above analysis, Ansible is a leading open source project with a high number of contributions and a diverse community of contributors. Thus Ansible is a well-supported and popular open source tool to orchestrate and manage OpenStack (a rough sketch of this kind of automation appears after this list).
  • Databricks Weaves Deep Learning into Cloud-Based Spark Platform
    Databricks, a company founded by the creators of the popular open-source Big Data processing engine Apache Spark, is a firm that we've been paying close attention to here at OStatic. We're fans of the company's online courses on Spark, and we recently caught up with Kavitha Mariappan, who is Vice President of Marketing at the company, for a guest post on open source tools and data science. Now, Databricks has announced the addition of deep learning support to its cloud-based Apache Spark platform. The company says this enhancement adds GPU support and integrates popular deep learning libraries into Databricks' big data platform, extending its capabilities to enable the rapid development of deep learning models. "Data scientists looking to combine deep learning with big data -- whether it's recognizing handwriting, translating speech between languages, or distinguishing between malignant and benign tumors -- can now utilize Databricks for every stage of their workflow, from data wrangling to model tuning," the company reports, adding "Databricks is the first to integrate these diverse workloads in a fast, secure, and easy-to-use Apache Spark platform in the cloud." (A minimal sketch of such a workflow appears after this list.)
  • OpenStack Building the Cloud for the Next 50 Years (and Beyond)
    Two OpenStack Foundation executives talk about what has gone wrong, what has gone right and what's next for the open-source cloud. BARCELONA, Spain—When OpenStack got started in 2010, it was a relatively small effort with only two companies involved. Over the last six years, that situation has changed dramatically with OpenStack now powering telecom, retail and scientific cloud computing platforms for some of the largest organizations in the world.
  • The Myth of the Root Cause: How Complex Web Systems Fail
    Complex systems are intrinsically hazardous systems. While most web systems fortunately don’t put our lives at risk, failures can have serious consequences. Thus, we put countermeasures in place — backup systems, monitoring, DDoS protection, playbooks, GameDay exercises, etc. These measures are intended to provide a series of overlapping protections. Most failure trajectories are successfully blocked by these defenses, or by the system operators themselves.
  • How to assess the benefits of SDN in your network
    Software-defined networking has matured from a science experiment into deployable, enterprise-ready technology in the last several years, with vendors from Big Switch Networks and Pica8 to Hewlett Packard Enterprise and VMware offering services for different use cases. Still, Nemertes Research's 2016 Cloud and Data Center Benchmark survey found a little more than 9% of organizations now deploying SDN in production.
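
As a rough illustration of the day 2 automation mentioned in the OpenStack item above, here is a minimal Python sketch using the openstacksdk library, the same layer Ansible's OpenStack modules build on. The cloud name, the reboot policy and the task itself are assumptions chosen for illustration, not something from the original article.

    # Hypothetical day-2 task: reboot compute instances stuck in ERROR state.
    # Assumes credentials for a cloud named "mycloud" exist in clouds.yaml.
    import openstack

    conn = openstack.connect(cloud="mycloud")

    for server in conn.compute.servers():
        if server.status == "ERROR":
            print("Rebooting {} ({})".format(server.name, server.id))
            conn.compute.reboot_server(server, reboot_type="HARD")

In practice a task like this would more likely be written declaratively as an Ansible playbook using the OpenStack modules; the Python form is shown only because it makes the underlying API calls explicit.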
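For the Databricks item above, the following is a minimal sketch of what combining deep learning with Spark can look like in practice: scoring records with a pre-trained Keras model inside a Spark job. The library choices (pyspark plus tensorflow/keras), file paths and column names are illustrative assumptions; this is not Databricks' own API.

    # Illustrative sketch only: distributed scoring of a pre-trained deep learning
    # model over a Spark DataFrame. Paths, columns and libraries are assumptions.
    from pyspark.sql import SparkSession

    def score_partition(rows):
        # Load the model once per partition instead of once per record.
        import numpy as np
        from tensorflow import keras
        model = keras.models.load_model("/path/to/model.h5")  # hypothetical path
        for row in rows:
            features = np.array(row["features"], dtype="float32").reshape(1, -1)
            yield (row["id"], float(model.predict(features, verbose=0)[0][0]))

    spark = SparkSession.builder.appName("dl-scoring-sketch").getOrCreate()
    df = spark.read.parquet("/path/to/feature_table")  # hypothetical input table
    scores = df.rdd.mapPartitions(score_partition).toDF(["id", "score"])
    scores.write.mode("overwrite").parquet("/path/to/scores")

Loading the model inside mapPartitions keeps it off the driver and amortizes the load cost across each partition, which is the usual pattern when pairing a single-node deep learning library with Spark's distributed data handling.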