
Ubuntu 5.04 Review/Install

Filed under: Reviews, Ubuntu

Can it really be run on PC and Mac platforms?
By TGodfrey

I recently received multiple copies of Ubuntu’s latest distribution, ‘The Hoary Hedgehog’. UbuntuLinux’s website claims "Ubuntu" is an ancient African word meaning "humanity to others"; it also means "I am what I am because of who we all are". The Ubuntu Linux distribution brings the spirit of Ubuntu to the software world. I do not think this could be any more true, plus my wife thinks it’s just fun to say.

I received live and installation CDs for Intel x86, AMD64/EM64T, and PowerPC. The organization believes that anyone who asks shall receive the distribution. I think this is a great way to get an easy-to-use version out there for those who want to use their computer, not be a slave to it. Ubuntu believes in productivity and ease of use in an operating system regardless of the hardware you own (or what was stuck in a closet). Most of the calls I receive are from customers with two-to-four-year-old hardware, so I thought the best place to see whether the distro is usable is on an older P3 and an Apple iMac.

I am a newly granted Novell CLP (Certified Linux Professional), have been teaching classes in Linux administration for quite a while, and genuinely just like the ‘openness’ of a wonderful operating system. The best way to introduce Linux to your customers, friends, business associates, etc. is simply to give them a live CD and let them play with it. I make a lot of copies of live CDs, but this one has most of the items my associates are looking for: it needs to be easy to use, easy to configure, have some business applications, web browsing, the ability to connect to their local network, and printing. Ubuntu does this with such ease it is almost scary.

Let’s take a look at Test #1 and Test #2…

Test #1: PC Installation

PC installation on an IBM PIII 600 MHz with 128 MB RAM, a 15 GB hard drive, and an Encore 54g wireless card. This installation took less than half an hour, even with configuring the printer and wireless. A word to the wise: it is much easier to configure the network through the PC’s wired NIC first and add the wireless afterward (I did pull the NIC out later). My wife wanted to replace our network printer (an Epson Stylus Scan 2500 connected to our SuSE Linux 9 server) with a standalone Brother MFC-5840CN printer/fax/scanner/copier. This is a great productivity device for well under $200 and extremely easy to configure.

Initial Installation

Boot from the CD and run through the install. The interface is a no-nonsense series of text screens; just fill in the answers as requested. Ubuntu will want the entire hard drive by default, so make sure to back up any important information before starting the installation. There are ways of setting up multiple partitions, but with this box only having 15 GB, it can have the whole thing.

The installation found the correct monitor, mouse, keyboard, and NIC, so installation was very quick. I was able to connect to my network drives (SuSE), other shared drives (Mac OS X and other non-Linux machines), and the internet with no problems. The installation did not see my wireless card or Brother printer automatically. It did see our old Epson without issue and was able to print to it.
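
For anyone curious what the shared-drive connection looks like from the command line, here is a minimal sketch of mounting an SMB share by hand; the server, share, mount point, and user names are made-up examples, and it assumes the smbfs tools are installed (the desktop’s network browser does the same job graphically):

    # create a mount point and mount a shared folder from the SuSE server over SMB
    sudo mkdir -p /mnt/suse-share
    sudo mount -t smbfs //suse-server/public /mnt/suse-share -o username=youruser
    # unmount when finished
    sudo umount /mnt/suse-share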

The only thing I don’t like is that you cannot log in as ‘root’ normally. It just proves we get spoiled; we really should use the computer with another login and only administer it with root privileges. This is easy enough to fix (a quick terminal recap follows the steps):

1. go to TERMINAL
2. type: sudo passwd root
3. enter your own password when sudo asks for it, then choose and confirm a new password for root
4. exit TERMINAL
5. Select: [System] – [Administration] – [Login Screen Setup]
6. Choose: [Security] tab
7. Check the ‘Allow root to login with GDM’ box, then [CLOSE]
8. Close out, reboot, login….
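
For reference, the terminal half of those steps boils down to a single command; sudo first asks for your own password, then passwd prompts twice for whatever new root password you choose:

    # give the root account a password of its own so it can log in directly
    sudo passwd root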

Wireless Configuration

This is easy to do by following the steps from the Ubuntu website. My Encore 54g wireless card is based on the RT2500 chipset. I just queried the site and it came up with: www.UbuntuLinux.org/wiki/Rt2500WirelessCardsHowTo. Make sure to use a wired PCI NIC for easier configuration of the card; you can remove it later. I’ll summarize some of the important steps from Mr. Rob Sharp’s instructions, with a quick sanity check of my own after the list.

1. go to TERMINAL
2. type: wget http://rt2x00.serialmonkey.com/rt2500-cvs-daily.tar.gz
3. type: tar -xzf rt2500-cvs-daily.tar.gz
4. type: sudo apt-get install build-essential linux-headers-$(uname -r)
5. type: cd ./rt2500-cvs-*/Module
6. type: make
7. To test, type: sudo insmod rt2500.ko
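
Before leaving the terminal it is worth confirming the module really loaded and created a wireless interface; with this driver the interface shows up as ra0. This quick check is my own addition, not part of Mr. Sharp’s write-up, and it assumes the wireless-tools package is present:

    # the new interface should be listed as ra0 with wireless extensions
    iwconfig
    # the tail of the kernel log should mention the rt2500 driver loading
    dmesg | tail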

Exit TERMINAL

1. Select: [System] – [Administration] – [Networking]
2. Find the wireless card, then Select  [Activate]

To finish up the install (one extra check of my own follows these steps):

1. go to TERMINAL
2. type: sudo ifdown ra0
3. type: sudo cp ~/rt2500-cvs-*/Module/rt2500.ko /lib/modules/`uname -r`/kernel/drivers/net/wireless/
4. type: echo "alias ra0 rt2500" | sudo tee /etc/modprobe.d/rt2500
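
One extra check of my own, not part of the original how-to: after copying a module into /lib/modules by hand, modprobe will not find it until the module dependency map is rebuilt, so rebuild it and test-load the driver once:

    # rebuild the module dependency map so modprobe can find the new rt2500.ko
    sudo depmod -a
    # load the driver through modprobe (rather than insmod) to prove the setup works
    sudo modprobe rt2500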

The issue now is that if you restart the computer, you have to run ‘sudo ifup ra0’ to bring up the wireless card. Mr. Sharp’s instructions go on to state that the user can update the /etc/network/interfaces file by defining the IP, subnet, SSID, a wireless key (if any), etc. The last line would be ‘auto ra0’ to bring up the card when the machine is turned on. All of the above took less than half an hour and has been working fine for well over a month. I pulled the internal NIC a little later that evening and have not looked back since.
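
As a rough illustration only (the addresses, ESSID, and WEP key below are placeholders, and DHCP is just as valid as a static address), the stanza in /etc/network/interfaces might look something like this; the wireless-* options come from the wireless-tools hooks, and some drivers of that era prefer explicit pre-up iwconfig lines instead:

    # example /etc/network/interfaces entry for the rt2500 card (values are placeholders)
    iface ra0 inet static
        address 192.168.1.50
        netmask 255.255.255.0
        gateway 192.168.1.1
        wireless-essid myhomenetwork
        wireless-key s:mysecretkey
    # last line per Mr. Sharp's instructions: bring the card up at boot
    auto ra0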

Printer Configuration

This was a little trickier. Ubuntu does not support this printer out of the box, so you will have to go to the Brother Solutions site and download the LPR and CUPS drivers (http://solutions.brother.com/linux/sol/printer/linux/cups_drivers.html). I just followed the instructions (a rough command-line sketch follows the list):

1. Download the LPR driver
2. Download the CUPS/Wrapper driver
3. Basically: rpm -ivh --nodeps drivername
4. Launch your web browser, go to: http://localhost:631
5. Manage / Add / Configure the printer (just follow the screens)
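
Since Ubuntu is Debian-based and Brother ships the drivers as RPM packages, the command-line portion of the list above looked roughly like this on my machine; the file names are placeholders, so substitute the exact MFC-5840CN packages downloaded from the Brother page, and the CUPS init script name may vary by release:

    # rpm is not installed by default on Ubuntu, so pull it in first
    sudo apt-get install rpm
    # install the LPR driver, then the CUPS wrapper driver (placeholder file names)
    sudo rpm -ivh --nodeps mfc5840cnlpr-x.x.x-x.i386.rpm
    sudo rpm -ivh --nodeps cupswrapperMFC5840CN-x.x.x-x.i386.rpm
    # restart CUPS so it sees the new driver, then finish up at http://localhost:631
    sudo /etc/init.d/cupsys restart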

I then used the above steps to switch my SuSE machines and Fedora 3 box from the Epson to the Brother printer. The Epson will probably go on eBay since it is no longer being used. Everything has been working well for the past several weeks.

Test #2: PowerPC trial

I have really liked Macintosh since playing with the different flavors of OS X. My G3 desktop died a couple of years ago, and the only thing I owned with an Apple icon is my iPod. I started going through the Mac websites looking for an inexpensive one that did not require much desk space. Like most of you, I share an office with multiple computers doing all sorts of ‘important things’…that’s what my wife hears, anyway. I’m sure she really didn’t want me dragging home yet another machine.

After almost two weeks, I found an Apple iMac G3 (‘bondi blue’…I was really hoping for ‘tangerine’…) with a 350 MHz processor, 128 MB RAM, and a 6 GB hard drive on eBay for under $100 with shipping. It was delivered a few days later and turned out not to contain a keyboard or mouse. I reread the eBay listing, and it did state that the little guy would not be coming home with a keyboard or mouse.

Great.

I didn’t want to wait, so I went to the nearest ‘computer place’ and purchased a Kensington wireless keyboard and mouse for under $60. I figured they could be used with another computer in case they did not work with the Mac. Installation is basically plugging them in and turning on the Mac. OS X came up with no problems and recognized the new Kensington hardware without issue. I then booted the Mac with the Ubuntu PowerPC live CD (remember to hold down the “C” key) and it came up flawlessly! This distro configured the monitor, sound, and wireless keyboard/mouse, and saw my network through the built-in port. Fantastic!

The screen looks identical to the PC version and works just as well. Since the machine is running from a live CD, it does run slowly; the iMac does not have a very fast CD drive. A bigger hard drive will be found in the near future so a proper installation of Ubuntu can go on the iMac. I guess that would make the little guy a uMac?

In conclusion, Ubuntu is a great all-around version of Linux. It is easy to load and configure, has many useful business applications, seems to understand just about any type of hardware, and just plain ‘works’. When business associates, customers, and friends ask about trying Linux, I can see making many copies of the live CD to get them hooked. Then it is just a matter of time before they start asking me for installation CDs. Linux for human beings…I can see that.



x86 version on the left; PowerPC on the right
