Valve has doubled down on its commitment to make Linux gaming an arena to take seriously by joining The Linux Foundation. “Joining the Linux Foundation is one of many ways Valve is investing in the advancement of Linux gaming. Through these efforts, we hope to contribute tools for developers building new experiences on Linux, compel hardware manufacturers to prioritize support for Linux, and ultimately deliver an elegant and open platform for Linux users,” said Mike Sartain of Valve. If anyone can deliver that long-awaited platform, it's Valve, which has found much success in the gaming world on other platforms.
For some time now, there have been signs that a renaissance for Linux gaming is on the way. In October, I reported on comments from Lars Gustavsson, creative director for EA Digital Illusions CE (DICE), the Electronic Arts studio that does the Battlefield series (seen in the screenshot above). Gustavsson told Polygon that DICE would love to delve into Linux games, and had this to say about the ever-elusive "killer game":
"We strongly want to get into Linux for a reason. It took Halo for the first Xbox to kick off and go crazy — usually, it takes one killer app or game and then people are more than willing [to adopt it] — it is not hard to get your hands on Linux, for example, it only takes one game that motivates you to go there."
Indeed, all it takes is one good console and one killer game to give rise to a gaming phenomenon. If you still don't take Linux gaming seriously, check out this video featuring Valve's Gabe Newell and Linus Torvalds talking about Linux and games.
Valve ported its popular Steam platform to Linux early this year, and its Steam Machines will be based on SteamOS as they arrive next year. SteamOS is Valve's very own Linux distribution. In joining The Linux Foundation, Valve will be able to connect better with developers and resources that can advance its platform.
Valve already has a whopping 65 million active accounts and games from hundreds of developers, so it stands a good chance of getting developers and gamers interested in its Steam Machines, especially if they can come in at low prices. We'll get a good sense of the possibilities as 2014 starts.
It was back in September that the Google Chrome team put an extensive post up heralding "packaged apps" that work with Chrome, which the team obviously felt could become a huge differentiator for Google's browser. "These apps are more powerful than before, and can help you get work done, play games in full-screen and create cool content all from the web," wrote the Chrome team. Since then, if you're a Chrome user, you may have tried some of these apps and experienced how they make the browser feel almost like an operating system underlying applications.
Now, there are lots of reports coming out that claim that Google plans to help developers port Chrome apps to other operating systems, ranging from Apple's iOS to Android. Some of the reports say we'll see the basic architecture of this concept arrive in January.
According to The Next Web:
"Google is working on bringing Chrome packaged apps from the desktop to the mobile world. The company is currently building a toolkit to help developers create Chrome apps for Android and iOS, as well as port their existing Chrome apps to both mobile platforms. The news comes by means of a GitHub repository we stumbled on called Mobile Chrome Apps led by Michal Mocny, a Software Developer at Google."
Unlike Android apps, Chrome apps are built around web standards, so enabling Chrome apps to straddle operating systems will require some technical gymnastics. ReadWriteWeb reports that "Google is apparently using Apache Cordova—the open source core of PhoneGap—to perform the task."
That may or may not be the case, but there is no reason to doubt that Google would like to pull this feat off. Attracting app developers is everything on mobile platforms, but Apple has done an insanely good job of making it most lucrative for developers to build apps for iOS. If Google can help usher in a world of easy app development for "hybrid apps" that can work across operating systems, that could enable developers to reach larger audiences. It might make many of them more friendly to Android.
You can get a sense of how the Chrome team is doing deep thinking about the plumbing of packaged apps in this post. Just as extensions were part of how the Firefox browser grew its audience over the course of many years, packaged apps can become differentiators for Chrome, and, apparently, they could do double duty across operating systems.
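If you haven't built one, a packaged app of the kind described above is declared with a small manifest plus a background script. Here is a minimal sketch of the manifest's shape (the app name and file name are illustrative):

```json
{
  "name": "Hello Packaged App",
  "version": "0.1",
  "manifest_version": 2,
  "app": {
    "background": {
      "scripts": ["background.js"]
    }
  }
}
```

The background script then creates the app's own window, which is what lets these apps run outside a normal browser tab and feel like native applications.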
In late November, I covered a debate that has been ongoing over whether OpenStack has emerged as a successful cloud computing platform in terms of actual deployments, or whether it is overhyped and immature. The post was a response to an online note from Gartner Research Director Alessandro Perilli, who came out with arguments that paint a fairly gloomy picture of actual OpenStack deployments.
Bryan Che from Red Hat then delivered a blog post that challenges Perilli, where he made the point that open source projects are not the same as products. In several areas, though, Che agreed with Perilli. Now, Alan Perkins, the CTO of Rackspace Australia, is being widely quoted on some retorts to the Perilli note.
Like Bryan Che, Perkins told Computerworld that the key thing to focus on is that many enterprises intend to deploy OpenStack in the coming months. In an interview, he said:
"These are early days, but the shift from Infrastructure-as-a-Service to Platform-as-a-Service is another sign of the maturity of the fundamentals and it will lead to an increase in adoption at the enterprise level. We anticipate a substantial increase in adoption of the OpenStack platform in the coming 6-12 months. We are at that point now where OpenStack is airborne. It has moved on from its initial stages of development to real delivery and enterprise adoption. An onward process of analysis and refinement will now strengthen the platform further. Projects that are easy to get off the ground typically have a limited life. Projects that are more all-encompassing (like the enterprise adoption of OpenStack) take more time to get off the ground, but once launched will be all pervasive and long lasting and that is what will determine its ultimate success."
Red Hat's Che noted the following in his post:
"I agree with Alessandro’s assessments of the challenge for vendors to bring OpenStack to the enterprise. Enterprises haven’t adopted OpenStack widely today. But, that’s not because they aren’t interested. Indeed, a recent survey that IDG Connect conducted of 200 enterprise IT decision makers on Red Hat’s behalf found that 84% planned to adopt OpenStack."
IDG Connect's survey isn't the only one pointing to extremely widespread intent to develop around OpenStack in enterprises. The OpenStack Foundation recently released the results of a broad user survey it conducted, complete with an infographic. Its survey found that cost savings, operational efficiency and the attraction of an open platform are key reasons why many enterprises are focused on OpenStack. The foundation's survey generated 822 responses and involved 387 existing OpenStack cloud deployments across 56 countries.
It's looking more and more like OpenStack's really rigorous field tests will happen on a widespread basis next year. Much more data is available here.
A lot has changed on the open source scene since the first FLOSS Survey launched way back in 2002. That survey provided one of the early, comprehensive views of open source developers and development communities around the world, and results from it were widely cited in academic publications, in the press and elsewhere. Now, more than a decade later, researchers are launching an updated version: FLOSS Survey 2013. Over 1,000 respondents are already participating and the survey closes on December 6.
You can take the survey here. The researchers who have launched it supply the following information:
"The questions in this survey are almost the same as the ones in 2002, although some minor changes have been introduced, as community and circumstances have changed during these last years. So, for instance, now we do not only target developers, but also all contributors to FLOSS projects."
The results of the survey will be published in an open access journal, and a version of the results will become available through a Creative Commons-licensed research report.
You can review the findings of the 2002 survey here. The updated survey is likely to point to changes in the open source development community. Communities are more robust and organized now, and there are more proven paths for steering open source projects toward commercial models. OStatic will follow up on results of the new survey.
Those aren't my words. Linux Journal's Readers' Choice Awards 2013 found that GNOME 3 was the Best "Worst" Linux/Open Source Idea to come down the pike. Of course the other categories may be of interest as well. These include Best Distribution, Best Linux Tablet, and Best Linux Server Vendor.
Back under the topic of Best "Worst" idea, readers chose Mir as a close third with 17.8% of the vote. Ubuntu going it alone, and just Ubuntu in general, earned over 20% of the vote. Some other bad ideas include a LibreOffice fork, putting GNU in front of Linux, and creating a new distro instead of contributing (which came in second with 19.5%). GNOME 3 ended up with 19.9% of the vote.
Now Linux Journal had several "best distribution" categories. These are:
Ubuntu took best distribution in both the general Best Linux distro and Best Desktop distro categories, with Mint and Arch also doing well in both. Debian, which took second in Best Linux distro, won Best for Netbooks/Limited Hardware as well as Best for High-Performance Computing. Android took Best Mobile OS.
Samsung won Best Linux Smartphone Manufacturer by a wide margin, as did Google's Nexus in the Best Tablet category. IBM barely beat Dell for Best Linux Server Vendor.
Those are the highlights. See Linux Journal's full write-up for all the gory details.
There have been some interesting developments surrounding Mozilla's Firefox OS platform and smartphones built on it. Alcatel had already delivered its popular OneTouch Fire phone based on the mobile operating system in countries ranging from Germany to Hungary and Poland. Now, the OneTouch Fire is going on sale at low prices in Italy via Telecom Italia. Meanwhile, Geeksphone has been discussing a high-end Firefox OS phone called Revolution that will purportedly run both Mozilla's platform and Android (though users will need to choose one platform).
You can check out the Geeksphone Revolution homepage here. It's light on details, but Geeksphone was among the first makers of Firefox OS phones and has made Android phones as well. CNET has reported on the company's plans to deliver phones with high-end architecture giving users the choice to run Android or Firefox OS.
The arrival of the OneTouch Fire in Italy illustrates that while Mozilla has stressed that Firefox OS phones will be steered toward developing markets, carriers will take these phones to larger, more established markets. The OneTouch Fire is already on sale in Germany and several other markets, and will go on sale in time for Christmas in Italy, according to ZDNet, which is citing initial pricing at €79.90. Other Firefox OS-based phones have been on sale in Spain and other European countries.
Mozilla is, so far, not wavering from its messaging focus on lower-end Firefox OS phones for emerging markets, but phone makers are already concentrating on higher-end versions and carriers will deliver these phones in larger and larger markets.
Although it has been in test mode for a year and already serves well-known players including Red Hat and Snapchat, Google has now officially rolled out its IaaS (infrastructure-as-a-service) Google Compute Engine (GCE) as a commercial service that will compete with Amazon Web Services (AWS) and other platforms. There are new lower prices for using the GCE platform and Service Level Agreements (SLAs) guarantee close to 100 percent availability.
There are some unique aspects to Google Compute Engine, including the fact that Google offers pay-as-you-go pricing billed in 10-minute increments. Google has lowered the price for standard instances by 10 percent. As an example, the price of a standard one core instance is now $0.104 per hour. There are also new 16-core instances for heavy computational needs.
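To see what the sub-hour billing described above means in practice, here is a rough Python sketch that rounds usage up to 10-minute increments, using the $0.104/hour standard-instance rate quoted in the article (the function name and default values are illustrative, not Google's API):

```python
import math

def gce_charge(minutes_used, hourly_rate=0.104, increment_min=10):
    """Estimate a charge under billing that rounds usage up to
    10-minute increments, per the pricing described above."""
    billed_minutes = math.ceil(minutes_used / increment_min) * increment_min
    return round(hourly_rate * billed_minutes / 60, 4)

# 25 minutes of a one-core standard instance bills as 30 minutes
print(gce_charge(25))  # 0.052
```

Note how a short-lived instance pays only for the increments it touches, rather than a full hour as some competing clouds billed at the time.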
Google had already concentrated Compute Engine on support for CentOS and Debian, both customized in unique ways. Now, developers can choose to use other operating systems, and Google will offer 24/7 support for SUSE, FreeBSD and Red Hat Enterprise Linux.
A post from Google added this about persistent storage options:
"Building highly scalable and reliable applications starts with using the right storage. Our Persistent Disk service offers you strong, consistent performance along with much higher durability than local disks. Today we’re lowering the price of Persistent Disk by 60% per Gigabyte and dropping I/O charges so that you get a predictable, low price for your block storage device. I/O available to a volume scales linearly with size, and the largest Persistent Disk volumes have up to 700% higher peak I/O capability. You can read more about the improvements to Persistent Disk in our previous blog post."
Google is putting a lot of emphasis on low prices and a Linux-based strategy with Google Compute Engine, and it has many partners including SaltStack, Wowza, Rightscale, Qubole, Red Hat, SUSE, and Scalr.
You can read much more about GCE, and watch a video, here.
While searching around for an interesting article topic I found several other headlines that just had to be shared. News broke last week of a new Linux worm. Softpedia thinks whatever Fedora can do, Korora can do better. Jack Wallen lists 10 reasons to use Red Hat. And Bryan Lunduke thinks some popular Linux distributions should develop a mobile version.
This isn't exactly new news, but in case you missed it, there's a new worm out in the wild targeting embedded Linux systems, including devices such as routers, cameras, and television boxes. It's been covered quite extensively, but Muktware.com has a nice little summary and relevant links. Our own Sam Dean covered this today as well.
Red Hat Enterprise Linux 6.5 was recently released, and today Jack Wallen compiled a list of reasons companies should use it. Number one on his list is Precision Time Protocol. He must have really run out of reasons at nine, because PTP sounds like a throw-in. But check out the rest of the list, which highlights some of the new features of Red Hat 6.5.
In other Red Hat news, Mark Cox published the latest risk report on RHEL, this time for 6.4 to 6.5. He said, "For a default install, from the release of 6.4 up to and including 6.5, we shipped 54 advisories to address 228 vulnerabilities. 3 advisories were rated critical, 18 were important, and the remaining 33 were moderate and low." See the rest of the report for more information.
Softpedia ran a short blurb on the release of Korora 20 Beta today titled, "Whatever Fedora 20 Does, Korora 20 Will Do It Better." I thought such a bold declaration deserved a look. Softpedia reports that "Korora 20 Beta features five versions, for GNOME, MATE, KDE, Cinnamon, and Xfce. It comes with a few default applications such as Adobe Flash, Google Chrome, Google Earth, RPMFusion, and VirtualBox. The distribution is based on KDE 4.11, Kernel 3.11.2, Firefox 25, and GNOME 3.10." See their piece for more.
And finally here are a few bonus links:
Net Applications is out with its latest Internet browser market share numbers, and if you were under the impression that open source browsers are fiercely threatening Microsoft's share with Internet Explorer, think again. At 58.36 percent share, Internet Explorer has actually hit a high for 2013. Firefox's share dropped very slightly to 18.54 percent and Chrome's share rose slightly to 15.44 percent, keeping Mozilla's and Google's browsers in a neck-and-neck race. But Internet Explorer's staying power is notable.
The data from Net Applications reflect November browser usage, which includes the first month of availability of IE 11 with Windows 8.1, and the debuts of Firefox version 25 and Chrome 31.
Here's a snapshot of the data:
Internet Explorer has been steadily climbing toward 60 percent market share throughout 2013, which shows the power Microsoft still wields in making its browser easily available on the Windows platform. Many users still don't want to be bothered with getting and using an alternative browser.
As for Chrome, Google has been steadily improving its browser, but it has been close to its current 15.44 percent share for about a year now. It should be noted that the new version 31 of Chrome, which includes many enhancements, is gaining significant market share.
Net Applications gets market share data from monitoring more than 40,000 websites and tracking 160 million unique visitors per month.
For many of us who run Linux, one of the attractions to doing so is being relatively free of security threats and malware. Every once in a while, though, a notable threat does target Linux, and Symantec researchers have issued an advisory warning of a new worm that targets not only Linux-based computers but many kinds of devices that run embedded Linux, including some routers and set-top boxes. The worm, Linux.Darlloz, exploits a PHP vulnerability to propagate itself.
According to security researcher Kaoru Hayashi:
"The worm utilizes the PHP 'php-cgi' Information Disclosure Vulnerability (CVE-2012-1823), which is an old vulnerability that was patched in May 2012. The attacker recently created the worm based on the Proof of Concept (PoC) code released in late Oct 2013."
"Upon execution, the worm generates IP addresses randomly, accesses a specific path on the machine with well-known ID and passwords, and sends HTTP POST requests, which exploit the vulnerability. If the target is unpatched, it downloads the worm from a malicious server and starts searching for its next target. Currently, the worm seems to infect only Intel x86 systems, because the downloaded URL in the exploit code is hard-coded to the ELF binary for Intel architectures."
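The exploit vector here is a query string that php-cgi misinterprets as command-line switches. As a rough stdlib-Python sketch of the kind of filter a web front end could apply (the function name is illustrative; the pattern follows the published CVE-2012-1823 workaround, which rejected '='-free query strings that decode to a leading '-'):

```python
from urllib.parse import unquote

def looks_like_php_cgi_attack(query_string):
    """Flag query strings with no key=value pairs that decode to a
    leading '-' switch -- the shape CVE-2012-1823 exploits use."""
    if "=" in query_string:
        return False  # ordinary form data, not a bare switch
    return unquote(query_string).lstrip().startswith("-")

print(looks_like_php_cgi_attack("-s"))                          # bare switch
print(looks_like_php_cgi_attack("%2Dd+allow_url_include%3D1"))  # encoded switch
print(looks_like_php_cgi_attack("page=about"))                  # normal query
```

A filter like this is no substitute for patching PHP, of course, since the real fix shipped back in May 2012.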
It's fairly easy to be infected by this worm and it can infect more than one part of a home or business network.
To protect against the worm, Symantec recommends users take the following steps:
Clement Lefebvre, founder of Mint, today announced the releases of Linux Mint 16 Cinnamon and MATE editions. In separate announcements Lefebvre said, "Linux Mint 16 is the result of 6 months of incremental development on top of stable and reliable technologies. This new release comes with updated software and brings refinements and new features to make your desktop even more comfortable to use."
Among the new or improved features are several performance tweaks. These are:
The software manager also received lots of bug fixes and performance improvements, but users might notice the updated interface first which now features several screenshots for the apps. New artwork and theme tweaks are also apparent. A new USB formatting tool has been introduced and the system foundations also got some improvements. These include:
Basing your new project on open source comes with a host of benefits, and a few risks. The risks are rarely, if ever, technical, but can often be political. When you choose to start a project based on open source tools, as opposed to proprietary solutions that come with a phone number to call when there is trouble, you are telling the company that you are competent enough to be the only support they need. Of course, with open source, you have the support of thousands behind you, but that can be difficult to convince senior management of. You will run into some roadblocks; here's how to avoid them and keep the project moving.
Most opposition to open source solutions is based on fear. Fear of the unknown, fear of being first, and most importantly, fear of sticking one's neck out. The best way to overcome this fear is to point out that your group is not going to be the first to implement this setup. I often do research on publicly available infrastructure information about Wikipedia, Mozilla, Google, and Facebook, all of which are major consumers of and contributors to open source. It can be difficult to compare your organization with these, but regardless of size, the basic principles still apply. Being involved with local user groups, where you can get in touch with others in your field working locally, can also be a great motivator; you can convey that you personally know others in the area who are using open source in a similar way.
After explaining that we are not blazing a new trail, my next step is to explain how the open source community puts control of the project back in our hands. Our data remains our data, free from the traps and lock-ins of proprietary solutions. Once the project is moved outside of a particular vendor's sphere of influence, and built on repeatable standards, the company is free to move to whichever vendor provides the best price for its service. Occasionally it makes sense for a company to outsource a particular function, but if the function in question is core to the business, the company should own it. Own it in full, from the source to the implementation, and in that ownership comes freedom and security.
IT organizations in large companies often have long-running relationships with vendors, third-party companies that "add value" to your purchase in the form of advice, consulting, and discounts. It is in the vendor's best interest to keep your company purchasing licenses from its partners, so it will balk at the mention of using an unsupported, open source solution. I actually had one laugh at me when I mentioned we were planning on running a new application on CentOS. "Good luck with that," he said. These vendors often have the trust of IT management, so it may be best to keep them out of the discussion until a conclusion has been reached, and bring them in as one of the last steps for hardware acquisition, if necessary.
It has been my experience so far that the benefits of owning the solution far outweigh the risk of relying on yourself to implement it. In fact, self-reliance breeds more invested, knowledgeable, and competent employees, which in turn makes the entire company stronger.
Probably everyone has heard of the Creative Commons license. It is routinely thought of as allowing or designating free-to-use works, although original owners can actually set restrictions. Well, on November 25 a new version was released that has "incorporated dozens of improvements that make sharing and reusing CC-licensed materials easier and more dependable than ever before."
The announcement states that one of the improvements is "expansion in license scope, which now covers sui generis database rights. Among other exciting new features are improved readability and organization, common-sense attribution, and a new mechanism that allows those who violate the license inadvertently to regain their rights automatically."
Some of the other improvements include
"We had ambitious goals in mind when we embarked on the versioning process coming out of the 2011 CC Global Summit in Warsaw. The new licenses achieve all of these goals, and more."
There continue to be many people around the globe who want to be able to use the web and messaging systems anonymously, despite the fact that some people want to end Internet anonymity altogether. Typically, the anonymous crowd turns to common tools that can keep their tracks private, and one of the most common tools of all is Tor, an open source tool used all around the world.
As a matter of fact, some data from earlier this year shows that usage of Tor has doubled in the wake of privacy invasion scandals involving the NSA and others. Now, a group of engineers on the Internet Engineering Task Force (IETF)--people with a fair amount of clout--have asked the architects of Tor to evaluate turning it into an Internet standard. If that happens, anonymity features could be ever present in applications of all kinds as well as the guts of the Internet.
Currently, you have to choose to use Tor if you want to keep your Internet tracks shielded from prying eyes. It is very easy to make that choice, though: the Tor Browser Bundle makes setup simple, and it ships with HTTPS Everywhere, the extension the Electronic Frontier Foundation recommends for users of the Firefox and Chrome browsers.
But the IETF group calling for Tor to become a default standard within the guts of the Internet has a grander vision of Tor's future. As David Talbot writes in MIT Technology Review:
"If the discussions bear fruit, it could lead to the second major initiative of the Internet Engineering Task Force (IETF) in response to the mass surveillance by the National Security Administration. Already the IETF is working to encrypt more of the data that flows between your computer and the websites you visit (see 'Engineers Plan a Fully Encrypted Internet')."
"Andrew Lewman, executive director of Tor, says the group is considering it. 'We're basically at the stage of "Do we even want to go on a date together?" It's not clear we are going to do it, but it's worth exploring to see what is involved. It adds legitimacy, it adds validation of all the research we've done,' he says. On the other hand, he adds: 'The risks and concerns are that it would tie down developers in rehashing everything we've done, explaining why we made decisions we made. It also opens it up to being weakened,' he says, because third-party companies implementing Tor could add their own changes."
Reportedly, Tor is a very elegantly architected tool, and some people think it could serve as an exemplary Internet protocol. It would certainly change the dynamics of the Internet--affecting everyone from advertisers to malware purveyors--if Tor became baked into the Internet itself. Just look at what advertisers had to say to Mozilla when it proposed escalating the privacy protection in the Firefox browser. You can bet that advertisers are going to sound off on the Tor proposal, and soon.
There are now officially more permutations of OpenStack than you can shake a stick at. Internap Network Services Corporation recently announced the beta version of its OpenStack-based hybrid, virtualized cloud platform, dubbed AgileCLOUD. "It is the first cloud platform that will fully expose both virtualized and bare-metal compute instances over a native OpenStack API and delivers significant performance, interoperability and flexibility benefits," the company claims. Meanwhile, there is a lot of debate going on over how many organizations are actually deploying OpenStack, and whether its community-driven development model may be a hindrance for the platform.
Internap is a public company with significant resources, and I've made the point many times that support is going to be the big differentiator as players in the OpenStack arena compete with each other. Maybe Internap will have some advantages as this version of AgileCLOUD moves along.
“AgileCLOUD is now 100% OpenStack ‘under the hood,’ which provides an open, interoperable framework that helps us deliver a dramatically more scalable platform,” said Raj Dutt, senior vice president of technology at Internap, in a statement. “Not only can we offer our customers all the benefits of the OpenStack platform, but we’ve implemented existing features that our hybridized customers find valuable – such as bare-metal cloud instances, static IP addresses, Layer 2 VLANs and compatibility with our existing hosting API (hAPI).”
You can watch a video of Dutt talking about AgileCLOUD here: http://www.internap.com/agilecloud-video/
Internap’s AgileCLOUD is a hosted solution offered out of its Santa Clara data center and is in beta. Participants in the AgileCLOUD beta will not be charged for virtualized infrastructure and will receive a $1,000 credit toward the service once it is generally available, according to the company. To sign up, visit http://www.internap.com/agilecloudbeta/.
In a post from a couple of days ago called "Are Voice Commands Headed for Chrome OS?" I covered Chromium guru Francois Beaufort's sneaky but interesting Google+ missive, where he suggested how voice commands might work. Well, just two days later, the Google team has another Google+ note up, and it announces that the company has launched the Google Voice Search Hotword extension for Chrome, which takes the 'OK Google' feature to the desktop for any users of the Chrome browser. The extension is in beta testing, but you can download it now from the Chrome Web Store.
According to the Google+ post:
"It’s that time of year… the in-laws are coming for a tasty Turkey Day dinner. You’re elbow-deep in your turkey, ready to start the stuffing and you need to quickly calculate how many ounces of walnuts are in a cup. This year, rather than stopping midway through to wash your hands and type in a search, you can just speak to your laptop: “Ok Google, how many ounces are in one cup?” Et voila, the cooking can go on. You can also say “Ok Google, set a timer for 30 minutes” so you don’t forget to baste that turkey. To access hands-free search on your laptop, just download the Google Voice Search Hotword extension from the Chrome Web Store: http://goo.gl/PdVTZM (available in English in the U.S.)."
Sorry non-English speakers, this extension doesn't cater to you, but it does signal a new direction from Chrome, and it's likely that the speech recognition features in this extension will get better very quickly.
As I pointed out in a post on Monday, the latest version of Android, KitKat, also includes responsiveness to voice commands, and there is one other interesting thing to take note of: Google recently hired futurist and tech pundit Ray Kurzweil, who is one of the world's leading experts on speech recognition and the pattern recognition science behind it. Kurzweil also specializes in text-to-speech technology, and has brought products to market based on it. Is the new voice extension for Chrome a product of Kurzweil's imagination? Could be.
On Slashdot today, there is an interesting point made about whether Mozilla and Firefox could answer Google with speech recognition for Firefox. The post points out: "Quick, someone wire Pocketsphinx up to Firefox."
If you're unfamiliar with Pocketsphinx, it's an open source toolkit for speech recognition, and the Mozilla team could indeed leverage it to bring simple voice commands to Firefox. Are we soon going to be talking to our browsers?
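Whatever the recognizer, wiring speech to a browser also needs a command layer that turns a raw transcript into an action. A minimal stdlib-Python sketch of that dispatch step, assuming a recognizer such as Pocketsphinx hands us plain text (the command phrases and handlers are illustrative):

```python
def dispatch(transcript, commands):
    """Match a recognized transcript against registered voice
    commands; return the handler's result, or None for no match."""
    text = transcript.lower().strip()
    for phrase, handler in commands.items():
        if text.startswith(phrase):
            # Pass the rest of the utterance as the command argument
            return handler(text[len(phrase):].strip())
    return None

commands = {
    "open": lambda arg: f"navigating to {arg}",
    "search for": lambda arg: f"searching: {arg}",
}

print(dispatch("Search for linux gaming", commands))
```

Real implementations would match fuzzily rather than by prefix, since recognizers often return imperfect transcripts, but the browser-side plumbing is roughly this simple.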
Some interesting headlines have jumped out of the RSS reader in the last couple of days. Phoronix.com reported that a KDE developer is missing, jeopardizing the whole Kdenlive project. The new OpenMandriva and openSUSE releases received reviews. And MakeTechEasier.com has a rundown of "Distros for Old Computers."
KDE Developer Missing
Phoronix's Michael Larabel reported recently that Jean-Baptiste Mardelle suddenly stopped posting to his blog, answering mail(ing lists), or applying commits. Mardelle was the project lead for the Kdenlive video editor project. Mardelle's last post was on Wednesday, July 3, 2013. No one really knows anything; all that's available are questions. Where is Mardelle? Is the project dead?
OpenMandriva Lx 2013.0 Review
The Navy Christian posted a review of the recently released OpenMandriva Lx 2013.0. He tested it on his MacBook Pro and had issues that may or may not be the result of the test bed. However, overall, it's an interesting read where one gets a good introduction to the new community Mandriva distribution.
Hands on with openSuSE 13.1
Jamie Watson at ZDNet.com recently posted about his experiences with openSUSE 13.1. Calling it "another outstanding release," Watson tested it on several hardware configurations. He said, "Everything works, as usual. I have come to expect this from openSuSE, and I was not disappointed."
A few days later, Watson shared his further adventures in openSUSE. Most interestingly, he tested btrfs on an ancient laptop, where he reports "it works perfectly."
Distros For Old Computers
Mayank Sharma of www.maketecheasier.com published an article today running down several choices for old or lower resource personal computers. He said, "Depending on the age of your hardware, you can revive it with a number of distros." Sharma speaks of Crunchbang, antiX, Puppy, and several others. Be sure to check that out.
Bonus: Say Hello to Elementary OS
Wired.com ran a piece on Elementary OS and its development team recently. "There are myriad Linux distros, but Elementary OS is different: It’s intended for desktop PCs, which are still very much the domain of Apple and Microsoft."
If you travel internationally with a portable computer and depend solely on WiFi to stay connected, you may want to look into some of the cheap data plans you can get now for SIM solutions that pop right into your device and keep you online from almost anywhere. One of the best-liked providers of these SIM cards for international data plans is Maxroam, which provides nearly ubiquitous connectivity in more than 200 countries. Now the company has delivered a set of SIM cards and roaming plans exclusively for users of Google Chromebooks.
Maxroam's plans for Chromebook users are detailed here, and are promoted as providing contract-free connectivity at low cost. According to Maxroam:
"Maxroam for Chromebook means always-on connectivity when you are on the move."
"Ok, but what does that mean exactly? Well it's simple really. Your Chromebook uses Wi-Fi when it can. Once you leave a Wi-Fi zone, your Maxroam for Chromebook SIM kicks in and you'll stay connected. Perfect for when you are on the move...We have two SIMs to choose from and you can top up with a range of data passes for the UK, Europe and the Rest of the World. Best of all, it's prepaid so you will always be in control of your data costs."
While I haven't used the service with a Chromebook, I have used Maxroam's SIMs and plans with other laptops, and they do provide solid connectivity from nearly anywhere.
The Maxroam for Chromebooks SIMs work in 55 countries, ranging from most European countries to Canada. You can also use them in China and Japan. The prices are cited at about 99p a day, so the plans are not expensive. You can investigate the two types of available SIMs and plans here.
As this year draws to a close, Android qualifies as one of the biggest technology success stories of all. More market research numbers are pouring in showing that Android has rapidly spreading usage and influence. Recently, numbers from researchers at IDC showed that Android has blown past 80 percent market share in the smartphone arena, with a total base of 211.6 million smartphone units shipped during the third quarter of this year.
Now, researchers at Canalys are out with a report that forecasts that tablet devices will make up 50 percent of the PC market next year, with tablets based on Android accounting for a hefty 65 percent of tablets shipped, or 185 million devices.
According to the Canalys research report, Samsung will stay dominant in Android tablets, though its share will decline next year:
"Android-derived operating systems will be responsible for driving growth in the market and are forecast to take 65% share in 2014 with 185 million units. Samsung continues to lead with strong year-on-year growth coming from its broad tablet portfolio, and in Q3 2013 it had a 27% share of Android tablet shipments. But with hundreds of small-to-micro brand vendors in established and high-growth markets and international players such as Acer, Asus, Lenovo, and HP, this market share statistic will also start to decline."
The report also forecasts that Microsoft will take 5% of the tablet PC market in 2014, up from just 2% in 2012. “To improve its position it must drive app development and better utilize other relevant parts of its business to round out its mobile device ecosystem,” said Canalys Research Analyst Pin Chen Tang, in a statement. “A critical first step is to address the coexistence of Windows Phone and Windows RT. Having three different operating systems to address the smart device landscape is confusing to both developers and consumers alike.”
Indeed, one of the reasons for Android's success is that Google has stayed focused on it without making missteps like trying to combine Android development with Chrome OS development.
And what lies ahead for Apple, according to Canalys? "Apple has maintained its top vendor position throughout 2013, and the launch of the iPad Air and new iPad mini will strengthen that position in Q4," the firm reports, also noting that Apple has consistently made money from the tablet boom. At every turn, Apple has focused more on its products and profits than simply growing market share. That should continue next year, according to the report.
Running top can give you a good high-level overview of the overall health of your server at the time you are looking at it. One of the most useful statistics presented is the %Cpu line, which is split into eight sections, each representing a possible state of a task using CPU resources. In my previous article on using top, I briefly mentioned three of the eight sections I glance at when troubleshooting a server. Today, I'd like to take a closer look.

%Cpu(s): 26.0 us, 7.4 sy, 0.0 ni, 66.6 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st
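If you want those fields in a script, a little awk pulls them apart. This is a minimal sketch that parses a captured sample line like the one above; in practice you would pipe in live output from `top -bn1` instead of echoing a fixed string.

```shell
# Captured sample for illustration; replace the echo with: top -bn1 | grep '%Cpu'
line='%Cpu(s): 26.0 us, 7.4 sy, 0.0 ni, 66.6 id, 0.0 wa, 0.0 hi, 0.0 si, 0.0 st'
echo "$line" | awk -F'[:,]' '{
    for (i = 2; i <= NF; i++) {
        split($i, f, " ")            # f[1] = value, f[2] = state label
        printf "%-2s %s%%\n", f[2], f[1]
    }
}'
```

Each state then prints on its own line (for example, "us 26.0%"), which is handy for feeding a monitoring script.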
To understand what this is telling us, let's take a look at each section in turn.
us - User Processes
Most likely, this is what the server is here to do. User processes are normal programs; in the case of your server, the services the machine is providing. It is important to note that these processes have not had their priority altered by nice; those that have are counted separately, under ni below.
sy - Kernel Processes
System memory is split into two areas: user space and kernel space. Processes running in user space have no direct access to hardware and cannot interfere with processes running in kernel space. Processes that run in kernel space have full access to the entire machine, including direct access to the hardware, so any error at this level tends to crash the entire system.
ni - User processes affected by nice
Some processes are more important than others. For example, you never want your backup program to affect the response time of a production web server. In this case, you might set the priority of the web server to -20, the highest priority, and the priority of the backup program to 19, the lowest. If any processes on your system have had their priority altered by nice, they will show up here.
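To see this in practice, here's a quick sketch; the echo stands in for a real backup command, and the renice PID is hypothetical.

```shell
# Launch a stand-in "backup" at the lowest priority, 19; negative
# values (down to -20, the highest priority) require root.
nice -n 19 sh -c 'echo "backup running at niceness $(nice)"'

# Lower the priority of an already-running process instead (PID is hypothetical):
# renice -n 19 -p 1234

# The NI column in ps (or top) shows each process's nice value:
ps -eo pid,ni,comm | head -n 5
```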
id - Idle
The CPU had some free time, so it spent it catching up on a few good books, doing a little fishing, maybe taking a nap in a hammock tied between two palm trees. A server with lots of idle time is underused; it could take on more work without trouble.
wa - I/O Wait
As I covered previously, I have found that keeping a close eye on I/O wait can give you a heads-up when there is trouble brewing. Perhaps you have a busy database in a shared SAN environment; the database may overload the shared disk, which then becomes unresponsive for milliseconds at a time. In this case, the CPU sets the processes waiting on a response from the disk aside, in a waiting state: not doing anything useful, but still taking up valuable resources. Ideally, this value should always be 0.0.
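You can watch this value without top, too. On Linux, top derives these percentages from deltas of the cumulative counters in /proc/stat; this sketch does the same rough math for the iowait field over a one-second window (a simplification, since it ignores the interrupt and steal counters).

```shell
# First line of /proc/stat: cpu user nice system idle iowait irq softirq steal ...
read -r _ u1 n1 s1 i1 w1 _ < /proc/stat
sleep 1
read -r _ u2 n2 s2 i2 w2 _ < /proc/stat

# Rough percentage: iowait delta over the sum of the five main state deltas.
total=$(( (u2-u1) + (n2-n1) + (s2-s1) + (i2-i1) + (w2-w1) ))
[ "$total" -gt 0 ] || total=1
echo "wa: $(( 100 * (w2-w1) / total ))%"
```

If the number is persistently high, iostat (from the sysstat package) can help pin down which disk is the bottleneck.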
hi - Hardware Interrupts
According to Wikipedia, "A hardware interrupt is an electronic alerting signal sent to the processor from an external device, either a part of the computer itself such as a disk controller or an external peripheral." A high percentage of hardware interrupts may point towards faulty hardware. My first stop after top would probably be a quick check through dmesg, followed by syslog, to see if anything had been logging errors.
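A useful companion check on Linux is /proc/interrupts, which counts interrupts per device since boot; a single line climbing far faster than the rest can corroborate a high hi reading in top.

```shell
# Per-device interrupt counts since boot, one column per CPU.
head -n 5 /proc/interrupts

# Then hunt for hardware complaints (log paths vary by distro, and
# dmesg may need root on some systems):
# dmesg | tail -n 20
# grep -i error /var/log/syslog | tail -n 5
```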
si - Software Interrupts
Also according to Wikipedia: "A software interrupt is caused either by an exceptional condition in the processor itself, or a special instruction in the instruction set which causes an interrupt when it is executed." I have not come across an issue where a server was showing a high amount of software interrupts.
st - Stolen Time
If your machine is running in a virtualized environment, as my Ubuntu server is, you may come across a non-zero percentage here, in the "stolen from this VM by the hypervisor" section. This is basically time that the host machine needed from the CPU and took from the guest. Ideally, this is another value that should always be zero, but if you find high numbers here, you may need to move some virtual machines off of the host.
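All eight states are exposed as cumulative counters on the first line of /proc/stat, in the order user, nice, system, idle, iowait, irq, softirq, steal; a one-liner prints them with top's labels, which makes a handy sanity check against what top displays.

```shell
# Cumulative jiffies since boot for each CPU state, labeled as top does.
awk '/^cpu / {
    printf "us=%s ni=%s sy=%s id=%s wa=%s hi=%s si=%s st=%s\n",
           $2, $3, $4, $5, $6, $7, $8, $9
}' /proc/stat
```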
Top does a good job of condensing a lot of information down into a readable format. Taking the extra time to read through the man pages and online documentation shows just how powerful this often overlooked tool really is.