
When installing, do you...

  • install CD only, no liveCD: 18% (100 votes)
  • livecd first, then install cd: 53% (285 votes)
  • either, doesn't matter: 25% (137 votes)
  • livecd only, no install: 4% (19 votes)
Total votes: 541

"scientists to this day are

"scientists to this day are still debating the concept"

Take a look here for some numbers: http://en.wikipedia.org/wiki/Global_warming

Of course we can all bury our heads in the sand and pretend everything is ok.

And yes it's a very misplaced discussion.

USB Flashdrive

I like the USB flashdrive for installs.

Starting with Kubuntu Intrepid Ibex, a Live CD on a USB drive is a simple matter (see Pendrivelinux.com).

Of course, this wasn't one of your poll options.

I have now installed Kubuntu Intrepid Ibex on two laptops from a USB flashdrive. (I also use the flashdrive as a "Live CD").

It's way nice!

Re: USB Flashdrive

Now that Ubuntu includes a graphical "usb-creator" utility right on the live CD, it's very easy, and also includes the option of having a persistent overlay, so changes are kept...for the most part.

(If you've previously run another distro off your USB key - like Slax - that uses syslinux to boot, you may have to repartition your USB stick and set the boot flag before running usb-creator for it to work properly.)
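For reference, the repartition-and-flag step can be done with parted. This is only a sketch: /dev/sdX and the function name are placeholders, and the commands wipe the stick completely, so double-check the device name with lsblk first.

```shell
#!/bin/sh
# Sketch: wipe a USB stick, create one FAT32 partition, and set the
# boot flag before running usb-creator. DESTROYS everything on the device!
wipe_and_flag_usb() {
    dev="$1"                                  # e.g. /dev/sdX (placeholder)
    parted -s "$dev" mklabel msdos            # write a fresh MS-DOS partition table
    parted -s "$dev" mkpart primary fat32 1MiB 100%
    parted -s "$dev" set 1 boot on            # mark partition 1 bootable
    mkfs.vfat "${dev}1"                       # format the new partition as FAT32
}

# Run as root after confirming the device, e.g.:
# wipe_and_flag_usb /dev/sdX
```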

Ubuntu doesn't seem to like my nvidia card very well, though.

When I run Ubuntu from the live CD, X only comes up with a resolution of 800x600 on my 1280x1024 monitor. On closer inspection, /etc/X11/xorg.conf was practically empty. Running "dpkg-reconfigure xserver-xorg" only took me through keyboard selection and quit. Cute. I mounted my openSUSE partition, copied an xorg.conf file over (one using the free "nv" driver), did a Ctrl-Alt-Backspace to restart X, and was in business at 1280x1024.
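For anyone who wants to replicate the fix, the manual steps amount to something like this sketch (the /dev/sda2 partition name is just an example for wherever your other distro lives; run as root):

```shell
#!/bin/sh
# Sketch: borrow a working xorg.conf from another distro's partition.
# /dev/sda2 is a placeholder for the partition holding the other install.
borrow_xorg_conf() {
    other_root="$1"                       # e.g. /dev/sda2 (placeholder)
    mkdir -p /mnt/other
    mount "$other_root" /mnt/other
    cp /mnt/other/etc/X11/xorg.conf /etc/X11/xorg.conf
    umount /mnt/other
    # then Ctrl-Alt-Backspace restarts X with the borrowed config
}

# e.g.: borrow_xorg_conf /dev/sda2
```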

When running from the USB key, Ubuntu backs up the known-good xorg.conf file and writes its own nearly empty xorg.conf - the one that only gets me 800x600 resolution. (Fortunately, it does back up the good one, which you can then restore.)

You can install the proprietary nvidia driver when running Ubuntu from a USB stick - you just have to restore the known-good xorg.conf file, load the nvidia driver ("modprobe nvidia"), and then restart GNOME, every time you use it.
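In script form, that per-session ritual looks roughly like the sketch below. The backup file name is an assumption on my part; look in /etc/X11/ on your own system for the name Ubuntu actually used when it saved your xorg.conf.

```shell
#!/bin/sh
# Sketch: re-enable the proprietary nvidia driver on a live USB session.
# The backup file name below is a guess -- check /etc/X11/ for the real one.
enable_nvidia() {
    cp /etc/X11/xorg.conf.backup /etc/X11/xorg.conf   # restore the known-good config
    modprobe nvidia                                   # load the proprietary module
    # finally restart the session, e.g. with Ctrl-Alt-Backspace
}

# Run as root each time you boot from the USB stick: enable_nvidia
```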

Please don't degenerate the

Please don't let the thread degenerate into another "Ubuntu this, Ubuntu that" thread.

Almost every distro these days offers a USB image along with instructions on how to use it. Some provide it officially; for others, the community supplies the image.

USB images offer a big advantage: no need to burn CDs or DVDs. And with the price of today's flashdrives falling and their storage capacity increasing...
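Many of those images can be written straight to the stick with dd; something like the sketch below (file and device names are placeholders, and dd will happily destroy whatever you point it at, so verify the device with lsblk first):

```shell
#!/bin/sh
# Sketch: write a distro's USB image directly to a flashdrive.
# Both arguments are placeholders; dd overwrites the target device!
write_usb_image() {
    iso="$1"   # e.g. distro.iso
    dev="$2"   # e.g. /dev/sdX -- confirm with lsblk before running
    dd if="$iso" of="$dev" bs=4M conv=fsync   # copy the image block-for-block
    sync                                      # make sure all writes hit the stick
}

# Run as root: write_usb_image distro.iso /dev/sdX
```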

Also, if using CD or DVD images, people should use CD-RW or DVD-RW media so as not to increase our individual and collective carbon footprints.

re: Please

Personally, I find drooling unoobtu fanboys way less annoying than vacuous tree huggers who can't do simple math or physics (excess CD/DVD usage is item 419,342 down the list of things that might make a tiny, insignificant change in the overall worldwide global warming problem).

Dude what's wrong with using

Dude what's wrong with using rewritable media? Are you allergic to it?

If it has even a slight chance of helping with the global warming problem, and saves some money over burning tons of CDs just to try a distro for 30 minutes, why not?

well

"fixing the global warming problem"

There is no conclusive evidence that there is a global warming problem. Scientists to this day are still debating the concept. So to use a possibly false argument as a basis for technological behavior is rather unreasonable.

There are several other good reasons one could give in promotion of conservation of resources. The theory of global warming isn't one of them.

Also, being a Linux tech site, it really is a misplaced discussion.

To test liveCD, to install either

To test a distro that is unknown to me (boot it to see if the hardware is recognized, whether the artwork is appealing, the application selection, etc.), I prefer the LiveCD.

However, if I've already used the distro, I prefer the install CD. These are faster and more to the point, especially when text based.

In fact, to actually install a distro I prefer a simple text-based install CD, as I tend to hate 'point and click' and the keyboard feels much faster to me.

And please, regardless of the format, always place documentation on the media and alert the user to it. I don't mean trivial help files; go for full-fledged documentation like Gentoo's Handbook or the Arch documentation, both distros being examples of how to put useful documentation on the install media.

As a side note, I love Debian's text-mode installer. FreeBSD's text-mode installer is also very good, as it explains each option in a very clear fashion.

Install choice

Live CDs have their purpose (testing hardware compatibility, fixing partitions, retrieving files, etc.), but they are a pretty poor choice for actually evaluating a distro (way too slow, plus adding/updating/modifying packages can be problematic).

I find that to actually test a distro, it's better to install it in a VM. Although there is still a performance hit, it's tiny compared to the LiveCD, plus you have the full distro (in all its glory or shame) to play with, all without affecting your primary OS or data.
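A minimal way to spin up such a throwaway test VM, assuming QEMU/KVM is installed (the file names are placeholders for whatever ISO you want to try):

```shell
#!/bin/sh
# Sketch: test-drive a distro ISO in a disposable QEMU/KVM virtual machine.
try_distro_vm() {
    iso="$1"                                              # e.g. distro.iso
    qemu-img create -f qcow2 /tmp/test-distro.qcow2 20G   # throwaway virtual disk
    qemu-system-x86_64 -enable-kvm -m 2048 \
        -cdrom "$iso" \
        -hda /tmp/test-distro.qcow2 \
        -boot d                                           # boot from the CD image first
}

# e.g.: try_distro_vm distro.iso
```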

Preference

Of course I like to check out a distro with a live CD, but if one is not available I will still install the distro to check it out. And I must say, some of the distros I hold in high regard didn't have live CD versions.

Always a live cd. Want to

Always a live CD. I want to see if it will work on my system and get a feel for the distro before I install.

How people choose to install a distro

I am hoping to get an inkling of how people choose their distro of choice and how much of a role the install media plays in that decision.

Please feel free to elaborate on how the install medium affects your decisions in this area.

Big Bear

More in Tux Machines

Leftovers: Software

Linux and FOSS Events

  • Debian SunCamp 2017 Is Taking Place May 18-21 in the Province of Girona, Spain
    It looks like last year's Debian SunCamp event for Debian developers was a total success, and Martín Ferrari is back with a new proposal that should take place later this spring: four days full of hacking, socializing, and fun. That's right, we're talking about Debian SunCamp 2017, an event any Debian developer, contributor, or user can attend to meet his or her Debian buddies, hack together on new projects or improve existing ones by sharing knowledge, plan upcoming features, and discuss ideas for the Debian GNU/Linux operating system.
  • Pieter Hintjens In Memoriam
    Pieter Hintjens was a writer, programmer and thinker who spent decades building large software systems and on-line communities, which he described as "Living Systems". He was an expert in distributed computing, having written over 30 protocols and distributed software systems. He designed AMQP in 2004, and founded the ZeroMQ free software project in 2007. He was the author of the O'Reilly ZeroMQ book, "Culture and Empire", "The Psychopath Code", "Social Architecture", and "Confessions of a Necromancer". He was the president of the Foundation for a Free Information Infrastructure (FFII), and fought the software patent directive and the standardisation of the Microsoft OOXML Office format. He also organized the Internet of Things (IoT) Devroom at FOSDEM for the last 3 years. In April 2016 he was diagnosed with terminal metastasis of a previous cancer.
  • foss-gbg on Wednesday
    The topics are Yocto Linux on FPGA-based hardware, risk and license management in open source projects and a product release by the local start-up Zifra (an encryptable SD-card). More information and free tickets are available at the foss-gbg site.

Leftovers: OSS

  • When Open Source Meets the Enterprise
    Open source solutions have long been an option for the enterprise, but lately it seems they are becoming more of a necessity for advanced data operations than merely a luxury for IT techs who like to play with code. While it’s true that open platforms tend to provide a broader feature set compared to their proprietary brethren, due to their larger and more diverse development communities, this often comes at the cost of increased operational complexity. At a time when most enterprises are looking to shed their responsibilities for infrastructure and architecture to focus instead on core money-making services, open source requires a fairly high level of in-house technical skill. But as data environments become more distributed and reliant upon increasingly complex compilations of third-party systems, open source can provide at least a base layer of commonality for resources that support a given distribution.
  • EngineerBetter CTO: the logical truth about software 'packaging'
    Technologies such as Docker have blended these responsibilities, causing developers to need to care about what operating system and native libraries are available to their applications – after years of the industry striving for more abstraction and increased decoupling!
  • What will we do when everything is automated?
    Just translate the term "productivity of American factories" into the word "automation" and you get the picture. Other workers are not taking jobs away from the gainfully employed, machines are. This is not a new trend. It's been going on since before Eli Whitney invented the cotton gin. Industry creates machines that do the work of humans faster, cheaper, with more accuracy and with less failure. That's the nature of industry—nothing new here. However, what is new is the rate at which the displacement of human beings from the workforce is happening.
  • Want OpenStack benefits? Put your private cloud plan in place first
    The open source software promises hard-to-come-by cloud standards and no vendor lock-in, says Forrester's Lauren Nelson. But there's more to consider -- including containers.
  • Set the Agenda at OpenStack Summit Boston
    The next OpenStack Summit is just three months away now, and as is their custom, the organizers have once again invited you–the OpenStack Community–to vote on which presentations will and will not be featured at the event.
  • What’s new in the world of OpenStack Ambassadors
    Ambassadors act as liaisons between multiple User Groups, the Foundation and the community in their regions. Launched in 2013, the OpenStack Ambassador program aims to create a framework of community leaders to sustainably expand the reach of OpenStack around the world.
  • Boston summit preview, Ambassador program updates, and more OpenStack news

Proprietary Traps and Openwashing

  • Integrate ONLYOFFICE Online Editors with ownCloud [Ed: Proprietary software latches onto FOSS]
    ONLYOFFICE editors and ownCloud are a match made in heaven, as one of our users once wrote. Inspired by this idea, we developed an integration app that lets you use our online editors in the ownCloud web interface.
  • Microsoft India projects itself as open source champion, says AI is the next step [Ed: Microsoft bribes to sabotage FOSS and blackmails it with patents; calls itself "open source"]
  • Open Source WSO2 IoT Server Advances Integration and Analytic Capabilities
    WSO2 has announced a new, fully-open-source WSO2 Internet of Things Server edition that "lowers the barriers to delivering enterprise-grade IoT and mobile solutions."
  • SAP license fees are due even for indirect users, court says
    SAP's named-user licensing fees apply even to related applications that only offer users indirect visibility of SAP data, a U.K. judge ruled Thursday in a case pitting SAP against Diageo, the alcoholic beverage giant behind Smirnoff vodka and Guinness beer. The consequences could be far-reaching for businesses that have integrated their customer-facing systems with an SAP database, potentially leaving them liable for license fees for every customer that accesses their online store. "If any SAP systems are being indirectly triggered, even if incidentally, and from anywhere in the world, then there are uncategorized and unpriced costs stacking up in the background," warned Robin Fry, a director at software licensing consultancy Cerno Professional Services, who has been following the case.
  • “Active Hours” in Windows 10 emphasizes how you are not in control of your own devices
    No edition of Windows 10, except Professional and Enterprise, is expected to function for more than 12 hours of the day. Microsoft most generously lets you set a block of 12 hours where you're in control of the system, and will reserve the remaining 12 hours for its own purposes. How come we're all fine with this? Windows 10 introduced the concept of "Active Hours", a period of up to 12 hours when you expect to use the device, meant to reflect your work hours. The settings for changing the device's active hours are hidden away among Windows Update settings, and they fit poorly with today's lifestyles. Say you use your PC in the afternoon and into the late evening during the work week, but use it from morning to early afternoon on weekends. You can't fit all those hours, nor accommodate home-office hours, in a period of just 12 hours. We're always connected, and expect our devices to always be there for us when we need them.
  • Chrome 57 Will Permanently Enable DRM
    The next stable version of Chrome (Chrome 57) will not allow users to disable the Widevine DRM plugin anymore, therefore making it an always-on, permanent feature of Chrome. The new version of Chrome will also eliminate the “chrome://plugins” internal URL, which means if you want to disable Flash, you’ll have to do it from the Settings page.