
Quick Look at Ubuntu 6.06 LTS Release Candidate

Filed under
Linux
Reviews

The highly successful Ubuntu development team has released a release candidate of its upcoming version 6.06 desktop operating system. We haven't tested Ubuntu for quite a while and thought it'd be interesting to see how things had changed. We also thought it might be of interest to others to see how this release is shaping up.

I downloaded the i386 PC version last night and it came in fairly quickly. The md5sums matched and I burnt it to a CD. This version, as is perhaps the norm for Ubuntu now, was an installable live CD. Using "safe graphical mode," the live CD booted into its trademark brown-motif GNOME desktop. The wallpaper was slightly different, but very similar to the one found in the last version I tested. These are always nice wallpapers, even if they are brown.
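
For anyone following along at home, the verify-and-burn step went roughly like this (a sketch; the exact ISO filename, the MD5SUMS file, and the burner device are assumptions that will vary by mirror and machine):

    # Check the ISO against the checksum list published on the mirror
    md5sum -c MD5SUMS        # or: md5sum ubuntu-6.06-rc-desktop-i386.iso
    # Burn the verified image to CD (burner device assumed to be /dev/hdc)
    cdrecord -v dev=/dev/hdc ubuntu-6.06-rc-desktop-i386.iso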


The menus aren't overflowing with applications, but it seems most basic categories of software are represented by at least one app. Applications include some tools it calls accessories, such as a calculator, dictionary, menu and text editors, a character map, and a terminal. The games menu seemed the most generous, including lots of GNOME games to distract one from one's work. These include card games, 2D board games, and some old favorites. The graphics menu includes the gThumb viewer, the GIMP, and the XSane scanner suite. Ubuntu correctly detected and identified my scanner, which wasn't turned on until seconds before I opened the application.


Continuing through the menu, we find a subheading for Internet. In that submenu we find Gaim, Firefox, the Ekiga phone application, the Evolution mail client, and the Terminal Server Client. In Office we find Evolution again and entries for the various OpenOffice.org 2.0.2 components. Under Sound and Video there are Totem, Sound Juicer, Rhythmbox, Serpentine, and a sound recorder. Totem wouldn't play any video files I had in my archive due to missing codecs or plugins, although it did play the example file included with Ubuntu. The music player, however, did nicely.
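
For anyone hitting the same codec wall with Totem, the usual cure is to pull in the extra GStreamer plugin packages from the universe repository, along these lines (package names are from memory and may differ by release, so treat this as a sketch):

    # Enable the universe/multiverse repositories in /etc/apt/sources.list, then:
    sudo apt-get update
    # Extra codec support for the GStreamer backend Totem uses
    sudo apt-get install gstreamer0.10-plugins-bad gstreamer0.10-plugins-ugly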


Under Places we find the usual GNOME routine. Some of these include Home Folder, Desktop, Computer, Network Servers, and Recent Documents. Under System are Preferences, Administration, and Help. Preferences usually holds apps or tools to help one customize the desktop, while Administration usually consists of system settings. Synaptic was included in the Administration menu, as well as a software preferences module that looked like a version selector. The Help menu had links to Ubuntu's website, wikis and such, as well as a local reference.


All the applications opened and appeared to function well in my limited testing. Performance was very good, with fast response times and stable behavior from every element. I found the live CD to operate very well.

On the desktop one finds an icon named Install. This is the hard drive installer, a graphical tool consisting of six basic steps. Overall it wasn't complicated, and it appeared newbie-friendly. I started the installer and was presented with a language selector. Next was a timezone configuration, followed by keyboard layout and user setup.


After this, things weren't as smooth. My case is probably atypical, though. The next step starts a partitioner that lets a user set up the partitions. The first screen lets you choose a disk for it to take over or opt to manually edit the partition table. If you choose manual, it will open QTParted. As my disk was already partitioned, I just clicked Forward and was presented with a list of partitions. Within this list is a proposed mount entry, as well as the option to reformat any or all of them. It seems to choose a partition for / at random, but one can adjust it for one's purposes.
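
With a disk this crowded, it's worth double-checking the partitioner's guesses from a terminal before letting it loose (a sketch; /dev/hda is an assumption, your disk may be /dev/sda):

    # Dump the partition table so the installer's proposed mounts can be sanity-checked
    sudo fdisk -l /dev/hda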


Next one sees a summary screen, and if it is to your liking, you click Install. The installer then runs a filesystem check on each partition again and attempts to format your chosen partition in ext3. Here I encountered my first troubles. The Ubuntu fsck became confused by my UNIX slices as well as several of my other partitions. I clicked Continue in hopes it would just ignore them, but it then complained it couldn't format my chosen install partition. At this point the installer exited and did no damage. Opening a console and running mkfs.ext3 manually allowed the installation to proceed past that rough spot during the second attempt at an install.
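
For reference, the manual workaround amounted to something like the following (the partition /dev/hda5 is purely illustrative; substitute your chosen install partition):

    # Format the install partition by hand when the installer's formatter balks
    sudo mkfs.ext3 /dev/hda5
    # Optionally confirm the new filesystem is clean before retrying the installer
    sudo fsck.ext3 -f /dev/hda5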


After it checks all your filesystems, the installer begins installing the system. It was very quick here, taking about 10 minutes. Then it installs the GRUB boot loader without asking the user for any preferences. I grumbled here, but it's happened before and it's not unrecoverable. Then it starts removing a lot of files from somewhere, and shortly one is presented with a small screen offering the choice to reboot or continue using the live CD. I rebooted.


I rebooted already prepared for the sight of GRUB, and without fail, GRUB was there. With no boot splash or other system chooser, it proceeded to try to boot Ubuntu. Here it just sat blinking a cursor. I gave it quite some time, and it never did budge. I grumbled some more at GRUB and booted the last live CD I had burnt before Ubuntu, which was at the top of my growing stack of CD-Rs. From there I was able to edit my lilo.conf, adding an entry for Ubuntu, and reinstall LILO to my MBR. Upon reboot, again, the kernel would not begin to boot. After choosing Ubuntu from my LILO menu, it too just sat there blinking at me. Ubuntu was not going to boot, thus ending my excursion into Ubuntu Linux this morning. As such, I was not able to test Synaptic, the software manager. I don't know if my experience is unique or whether others had better luck, as I didn't bother checking around.
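
For the curious, the recovery from the other live CD boiled down to a stanza like this in lilo.conf, followed by rewriting the MBR (the kernel version, mount point, and partition are illustrative guesses, not the exact values from my system):

    # /etc/lilo.conf -- added stanza for the new Ubuntu install
    image=/mnt/ubuntu/boot/vmlinuz-2.6.15-23-386
        label=ubuntu
        root=/dev/hda5
        initrd=/mnt/ubuntu/boot/initrd.img-2.6.15-23-386
        read-only
    # Write the updated boot map to the MBR
    sudo lilo -v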

In conclusion, I have mixed feelings about my experiences. On one hand, the live CD booted with little issue, none at all when I chose safe graphical mode, into a fast and stable GNOME desktop with a sufficient set of stable applications. The installer is a bit clunky, and I experienced some minor glitches. The system not booting is surprising for a release candidate, and I have hopes this is a minor issue or one that isn't common amongst all testers. Ubuntu fans will probably not be disappointed once all the kinks are straightened out.

Re: Ubuntu never publish the computers they certified?

atang1 wrote:

If you knew which platform and components Ubuntu is written on, you would have the template that Ubuntu used to compile all the programs to work on your computer. A compatibility list can no longer tell, since too many devices are ambiguous and need the proper firmware and operating system support in the drivers. udev is still under construction. You are lucky that your computer can run the live CD; so please let people know the computer specifications.

Oh, sorry. It's that AMD64 3700+ on the ASUS A8V with 1 GB of RAM. I used the i386 version. Unless otherwise stated in the article, I always use regular x86 systems to test for reviews.

atang1 wrote:

In your case, too many partitions probably caused the installation failure. The question is, will a single partition work? Did Ubuntu have a template that will only work for a certain arrangement, such as one Windows and several competing Linux systems, or all the Ubuntu varieties?

You reckon? I know it was confused by some of the filesystems on it. The UNIX slices for the BSD clones threw it for a loop, and it claimed errors on some other ReiserFS and ext2 partitions. Also, an early pre-beta installed and booted. That system was slightly different, using an AMD 2800+, but it still had many, many partitions, including two BSDs.


More in Tux Machines

Kernel: Kernelci.org, Tripwire, Linux Foundation, R600 Gallium3D

  • Kernelci.org automated bisection
    The kernelci.org project aims at continuously testing the mainline Linux kernel, from stable branches to linux-next on a variety of platforms. When a revision fails to build or boot, kernel developers get informed via email reports. A summary of all the results can also be found directly on the website.
  • Securing the Linux filesystem with Tripwire
    While Linux is considered to be the most secure operating system (ahead of Windows and MacOS), it is still vulnerable to rootkits and other variants of malware. Thus, Linux users need to know how to protect their servers or personal computers from destruction, and the first step they need to take is to protect the filesystem. In this article, we'll look at Tripwire, an excellent tool for protecting Linux filesystems. Tripwire is an integrity checking tool that enables system administrators, security engineers, and others to detect alterations to system files. Although it's not the only option available (AIDE and Samhain offer similar features), Tripwire is arguably the most commonly used integrity checker for Linux system files, and it is available as open source under GPLv2. (A minimal usage sketch follows this list.)
  • Open Source Networking and a Vision of Fully Automated Networks
    Ever since the birth of local area networks, open source tools and components have driven faster and more capable network technologies forward. At the recent Open Source Summit Europe, Arpit Joshipura, Networking General Manager at The Linux Foundation, discussed his vision of open source networks and how they are being driven by full automation. “Networking is cool again,” he said, opening his keynote address with observations on software-defined networks, virtualization, and more. Joshipura is no stranger to network trends. He has led major technology deployments across enterprises, carriers, and cloud architectures, and has been a steady proponent of open source. “This is an extremely important time for our industry,” he said. “There are more than 23 million open source developers, and we are in an environment where everyone is asking for faster and more reliable services.”
  • R600 Gallium3D Gets Some Last Minute Improvements In Mesa 18.0
    These days when Dave Airlie isn't busy managing the DRM subsystem or hacking on the RADV Vulkan driver, he's been spending a fair amount of time on some OpenGL improvements to the aging R600 Gallium3D driver. That's happened again and he's landed some more improvements just ahead of the imminent Mesa 18.0 feature freeze.
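
As promised above, the basic Tripwire cycle looks roughly like this (a sketch of the open-source tripwire package; site/local key generation and policy configuration are omitted):

    # Build the baseline database from the current, trusted filesystem state
    sudo tripwire --init
    # Later, compare the filesystem against that baseline and write a report
    sudo tripwire --check
    # If flagged changes turn out to be legitimate, fold them into the database
    sudo tripwire --update --twrfile /var/lib/tripwire/report/<hostname-date>.twr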

OSS Leftovers

  • Reliance Jio and global tech leaders come together to push Open Source in India
    The India Digital Open Summit, which will be held tomorrow at the Reliance Corporate Park campus in Navi Mumbai, is a must-attend event for industry leaders, policymakers, technologists, academia, and developer communities working towards India’s digital leadership through Open Source platforms. The summit is hosted by Reliance Jio in partnership with the Linux Foundation and supported by Cisco Systems.
  • Open-source software simulates river and runoff resources
    Freshwater resources are finite, unevenly distributed, and changing through time. The demand—and competition—for water is expected to grow both in the United States and in the developing/developed world. To examine the connection between supply and demand and resulting regional and global water stresses, a team developed Xanthos. The open-source hydrologic model is available for free and helps researchers explore the details and analyze global water availability. Researchers can use Xanthos to examine the implications of different climate, socioeconomic, and/or energy scenarios over the 21st century. They can then assess the effects of the scenarios on regional and global water availability. Xanthos can be used in three different ways. It can operate as an independent hydrologic model, driven, for example, by scenarios. It can serve as the core freshwater supply component of the Global Change Assessment Model, where multiple sectors and natural systems are modeled simultaneously as part of an interconnected, complex system. Further, it can be used by other integrated models and multi-model frameworks that focus on energy-water-land interactions.
  • “The Apache Way” — Open source done well
    I was at an industry conference and was happy to see many people stopping by the Apache booth. I was pleased that they were familiar with the Apache brand, yet puzzled to learn that so many were unfamiliar with The Apache Software Foundation (ASF). For this special issue, “All Eyes On Open Source”, it’s important to recognize not just Apache’s diverse projects and communities, but also the entity behind their success. Gone are the days when software and technology, in general, were developed privately for the benefit of the few. As technology evolves, the challenges we face become more complex, and the only way to effectively move forward to create the technology of the future is to collaborate and work together. Open Source is a perfect framework for that, and organizations like the ASF carry out a decisive role in protecting its spirit and principles.
  • Learn how to run Linux on Microsoft's Azure cloud
  • LLVM 6.0-RC1 Makes Its Belated Debut
    While LLVM/Clang 6.0 was branched earlier this month and under a feature freeze with master/trunk moving to LLVM 7.0, two weeks later the first release candidate is now available. Normally the first release candidate comes immediately following the branching / feature freeze, but not this time due to the shifted schedule with a slow start to satisfy an unnamed company seeking to align their internal testing with LLVM 6.0.
  • Hackers can’t dig into latest Xiaomi phone due to GPL violations
    Yet another Android OEM is dragging its feet with its GPL compliance. This time, it's Xiaomi with the Mi A1 Android One device, which still hasn't seen a kernel source code release.  

    Android vendors are required to release their kernel sources thanks to the Linux kernel's GPLv2 licensing. The Mi A1 has been out for about three months now, and there's still no source code release on Xiaomi's official GitHub account.

  • 2017 - The Year in Which Copyright Went Beyond Source Code
    2017 was a big year for raising the profile of copyright in protecting computer programs. Two cases in particular helped bring attention to a myth that was addressed and dispelled some time ago but persists in some circles nonetheless. Many lawyers hold on to the notion that copyright protection for software is weak because such protection inheres in the source code of computer programs. Because most companies that generate code take extensive (and often successful) measures to keep source code out of the hands of third parties, the utility of copyright protection for code is often viewed as limited. However, copyright also extends to the “non-literal elements” of computer programs, such as their sequence, structure and organization, as well as to things such as screen displays and certain user interfaces. In other words, copyright infringement can occur when copying certain outputs of the code without there ever having been access to the underlying code itself.
  • Announcing WebBook Level 1, a new Web-based format for electronic books
    Eons ago, at a time when BlueGriffon was only a WYSIWYG editor for the Web, my friend Mohamed Zergaoui asked why I was not turning BlueGriffon into an EPUB editor... I had been observing the electronic book market since the early days of Cytale and its Cybook, but I was not involved in it on a daily basis. That seemed not only an excellent idea, but also a fairly workable one. EPUB is based on flavors of HTML, so I would not have to reinvent the wheel. I started diving into the EPUB specs the very same day, EPUB 2.0.1 (released in 2009) at that time. I immediately discovered a technology that was not far away from the Web but that was also clearly not the Web. In particular, I immediately saw that two crucial features were missing: it was impossible to aggregate a set of Web pages into an EPUB book through a trivial zip (see the sketch after this list), and it was impossible to unzip an EPUB book and make it trivially readable inside a Web browser, even with graceful degradation. When the IDPF started working on EPUB 3.0 (with its 3.0.1 revision) and 3.1, I said this was coming too fast, and that the lack of test suites with interoperable implementations, as we often have in W3C exit criteria, was a critical issue. More importantly, the market was, in my opinion, not ready to absorb so quickly two major and one minor revisions of EPUB, given the huge cost on both publishing chains and existing ebook bases. I also thought - and said - the EPUB 3.x specifications were suffering from clear technical issues, including the two missing features quoted above.
  • Firefox 58 Bringing Faster WebAssembly Compilation With Two-Tiered Compiler
    With the launch of Mozilla Firefox 58 slated for next week, WebAssembly will become even faster thanks to a new two-tiered compiler.
  • New Kernel Releases, Net Neutrality, Thunderbird Survey and More
    In an effort to protect Net Neutrality (and the internet), Mozilla filed a petition in federal court yesterday against the FCC. The idea behind Net Neutrality is to treat all internet traffic equally and without discrimination against content or type. Make your opinions heard: Monterail and the Thunderbird email client development team are asking for your assistance to help improve the user interface in the redesign of the Thunderbird application. Be sure to take the survey.
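
As a footnote to the WebBook item above, "not a trivial zip" is the key complaint: a valid EPUB archive must start with an uncompressed mimetype entry and carry a META-INF/container.xml pointing at the package document, so a plain zip of a folder of Web pages won't do. A rough sketch of the difference (the file layout is assumed for illustration):

    # A naive zip of web pages is NOT a valid EPUB:
    #   zip -r book.epub pages/
    # A valid EPUB needs the mimetype entry first, stored uncompressed:
    zip -X0 book.epub mimetype          # -0 = store only, -X = no extra attributes
    zip -rg book.epub META-INF OEBPS    # -g = grow (append to) the existing archive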

IBM code grandmaster: what Java does next

Reports of Java’s death have been greatly exaggerated — said, well, pretty much every Java engineer there is. The Java language and platform may have been (in some people’s view) somewhat unceremoniously shunted into a side alley by the self-proclaimed aggressive corporate acquisition strategists (their words, not ours) at Oracle… but Java still enjoys widespread adoption and, in some strains, growing use and development.

Programming/Development: Git 2.16, Node.js, Testing/Bug Hunting

  • Git v2.16.0
    The latest feature release Git v2.16.0 is now available at the usual places. It is comprised of 509 non-merge commits since v2.15.0, contributed by 91 people, 26 of which are new faces.
  • Git 2.16 Released
    Git maintainer Junio Hamano has released version 2.16.0 of this distributed revision control system.
  • Announcing The Node.js Application Showcase
    The stats around Node.js are pretty staggering. There were 25 million downloads of Node.js in 2017, with over one million of them happening on a single day. And these stats are just the users. On the community side, the numbers are equally exceptional. What explains this immense popularity? What we hear over and over is that, because Node.js is JavaScript, anyone who knows JS can apply that knowledge to build powerful apps — every kind of app. Node.js empowers everyone from hobbyists to the largest enterprise teams to bring their dreams to life faster than ever before.
  • Google AutoML Cloud: Now Build Machine Learning Models Without Coding Experience
    Google has been offering pre-trained neural networks for a long time. To lower the barrier to entry and make AI available to developers and businesses everywhere, Google has now introduced Cloud AutoML. With the help of Cloud AutoML, businesses will be able to build machine learning models with the help of a drag-and-drop interface. In other words, if your company doesn’t have expert machine-learning programmers, Google is here to fulfill your needs.
  • Re-imagining beta testing in the ever-changing world of automation
    Fundamentally, beta testing is a test of a product performed by real users in the real environment. There are a number of names for this type of testing—user acceptance testing (UAT), customer acceptance testing (CAT), customer validation and field testing (common in Europe)—but the basic components are more or less the same. All involve user testing of the front-end user interface (UI) and the user experience (UX) to find and resolve potential issues. Testing happens across iterations in the software development lifecycle (SDLC), from when an idea transforms into a design, across the development phases, to after unit and integration testing.