Exploring the Future of Computing
When we consider any new features or changes for Steam, our primary goal is to make customers happy. We measure that happiness by how well we are able to connect customers with great content. We've come to realize that in order to serve this goal we needed to move away from a small group of people here at Valve trying to predict which games would appeal to vastly different groups of customers.
Thus, over Steam's 13-year history, we have gradually moved from a tightly curated store to a more direct distribution model. In the coming months, we are planning to take the next step in this process by removing the largest remaining obstacle to having a direct path, Greenlight. Our goal is to provide developers and publishers with a more direct publishing path and ultimately connect gamers with even more great content.
This is a big step for Steam, and will make it far easier for developers and publishers alike to publish games on Steam.
This update, 2.1.0, alias Iijoki, brings major architectural changes to Sailfish OS by introducing the Qt 5.6 UI framework, the BlueZ 5 Bluetooth stack (ready to be deployed for development purposes), the basics for 64-bit support, and text selection in the browser. Also included are a beta-level implementation of Virtual Private Networks (VPN) (please read the release notes) and the first version of QML live coding support. In addition, 2.1.0 adds bigger fonts to the UI, improves the camera experience, and fixes a number of errors, many of which were reported by our developer community.
Maybe I'll get around to updating my Jolla phone and tablet at some point, but I really don't see a reason why. Since I reviewed Sailfish OS and the Jolla phone more than three years ago, nothing has been done to address the elephant in the room. The operating system itself was quite stable, good-looking and full-featured from the beginning, and that has only improved with the constant stream of updates and refinements. However, the application situation is still incredibly dire, and we're all still using the same few applications - updated only very infrequently - that we were using three years ago. Several have even died out.
Instead of investing in attracting developers to write Sailfish applications (the three-year-old promise of support for paid applications still hasn't been fulfilled, for instance), the company got distracted with crazy projects like the tablet, and invested heavily in making Android applications 'run' on Sailfish. While Android applications do 'run', it's still a slow, frustrating, and utterly jarring experience that's a complete and utter waste of resources. Had they spent even half the effort poured into Android application compatibility on attracting native developers, the platform would be in a far better state.
Jolla proclaimed they wanted to take over the world, but in doing so, lost touch with the very people they should've continued to focus on: open source/Linux-oriented enthusiasts, former Maemo/N900 users. Not a large group of people, of course, but definitely a big enough - and, more importantly, loyal enough! - group of people to sustain a small, community-focused company.
Jolla's CEO Sami Pienimäki penned a letter to the community about upcoming developments for the company. There's some stuff in there about Russia and tablet refunds.
Android Wear 2.0 is also buggier than it should be, especially given the fact that it had an extended public beta period and its launch was delayed by months. Beyond apps taking a long time to launch, it can be hard to tell when an app is actually launching, because the screen will flicker back to the list of apps before launching the one you just tapped. The Google Assistant also crashed often, forcing me to repeat my inquiry multiple times (or, more likely, I just got frustrated with it and pulled out my phone).
The changes and improvements look decent, but if you don't first get the above things right, they're all for naught. When will software makers learn that performance - especially UI responsiveness - is the single most important part of a consumer-oriented device?
Not that it matters to me - for some mysterious reason, these new watches won't be coming to The Netherlands. They're coming to the rest of Europe - just not The Netherlands. The Google Pixel is also still pretty much sold out in the two or three countries where it's supposedly available, with no indication they're ever going to be available elsewhere.
Here's a tip, Google: if you want to be a successful hardware maker, maybe make sure interested consumers can actually, you know, buy your stuff?
Just as Windows' development had become complex and fragmented, so too did the company's internal systems for things like source control, issue tracking, testing, building, code analysis, and all the other tasks that fall under the application lifecycle management umbrella. And just as Windows' development was unified as OneCore, the company has embarked on an effort to unify its ALM and develop what it calls One Engineering System (1ES).
The cornerstone of 1ES is TFS, but for 1ES, the company wanted to do more than just standardize on TFS; it wanted to switch to a single version control system. TFVC, Source Depot, and Git were the obvious contenders, though other options such as Mercurial were also considered. In the end, the company standardized on Git.
Why reinvent the wheel all the time, when you can just use a tool everybody else is already using anyway?
Due to an SSD failure last year, I lost a bunch of my virtual machines, including my Windows 3.11 virtual machine. I don't actually use these for anything, but I like having these old operating systems at my fingertips, in case, I don't know, the world is about to end and the only way to prevent it is to run a very specific Windows 3.11-only application. So, yesterday, I recreated the virtual machine.
This seems like an excellent opportunity to link to the original Windows for Workgroups (Windows 3.11) launch event, from October 1992. I'm not even going to try to characterise or summarise this event, because it's so incredibly Microsoftian and '90s, the English language simply doesn't contain enough words to paint an accurate picture.
I grew up with MS-DOS and later Windows 3.x, so this is a strange, somewhat... Twisted throwback to... Let's call it 'simpler' times.
Augustin Cavalier (also known as Waddlesplash) was a guest on The Lunduke Hour, where he explained a lot about what's been going on with the Haiku project for the last couple of years, and why it's taken so long to get from Alpha 4 to the upcoming Beta 1.
Cavalier goes into Haiku's rather unique package management system, progress on the application front, and tons of other things. Definitely worth a listen.
Business owners in the town of Buea, the capital of the Southwest Region of Cameroon say they are struggling to operate following an internet shutdown that began on January 17.
What did Vizio know about what was going on in the privacy of consumers' homes? On a second-by-second basis, Vizio collected a selection of pixels on the screen that it matched to a database of TV, movie, and commercial content. What's more, Vizio identified viewing data from cable or broadband service providers, set-top boxes, streaming devices, DVD players, and over-the-air broadcasts. Add it all up and Vizio captured as many as 100 billion data points each day from millions of TVs.
Vizio then turned that mountain of data into cash by selling consumers' viewing histories to advertisers and others. And let's be clear: We're not talking about summary information about national viewing trends. According to the complaint, Vizio got personal. The company provided consumers' IP addresses to data aggregators, who then matched the address with an individual consumer or household. Vizio's contracts with third parties prohibited the re-identification of consumers and households by name, but allowed a host of other personal details - for example, sex, age, income, marital status, household size, education, and home ownership. And Vizio permitted these companies to track and target its consumers across devices.
That's... That's a lot of very creepy spying.
Per Arca Noae's revised release schedule, and as announced at Warpstock 2016, Blue Lion (ArcaOS 5.0) moved into beta testing stage today. The first beta release has been made available to the test team, and we anticipate a rigorous round of installation, modifications, formatting, deletion, disk wiping, and all that other fun stuff which accompanies a healthy beta test.
We do not anticipate a public beta cycle nor are we planning a gamma release or an untold number of release candidates. Instead, we fully expect ArcaOS 5.0 to emerge from beta testing at the end of March and to become generally available at that time.
As mentioned during earlier coverage, ArcaOS is a sort-of continuation of eComStation, as it was founded by several eCS developers who felt eCS had ground to a halt.
The VA2000 is an FPGA-based graphics card for Amiga 2000/3000/4000 computers, featuring high resolutions and color depths over DVI-D/HDMI. It has a hacker-friendly expansion header for upgrades and custom mods, and features a slot for MicroSD cards that can be mounted in AmigaOS.
The YouTube video provides additional insight into the open source graphics card. Interestingly enough, I've been looking into getting my hands on a classic Amiga, but the ones I would want - an A3000 or A4000 - are quite hard to come by here in The Netherlands.
This is interesting: it turns out there was a NextStep release for IBM AIX workstations. From the initial, archived press release (via Steven Troughton-Smith):
AIX PS/2 NextStep Environment Version 1.1 is a state-of-the-art graphical user interface and programming environment for AIX workstations, designed to be compatible with the same application programming interface (API) as the NextStep product, Software Release 1.0, provided by NeXT, Incorporated.
AIX PS/2 NextStep Environment Version 1.1 provides icons and menus to facilitate access to system utilities and applications. The AIX NextStep Interface Builder is designed to provide a rich set of well-defined objects and graphical cut-and-paste capabilities for designing and implementing application user interfaces. The Objective-C (3) Compiler provides the benefits of object-oriented programming for developers who choose to design additional objects for the application development environment. AIX PS/2 NextStep Environment can help increase the productivity of programmers and end users.
Steven Troughton-Smith, who has a thing for collecting NeXT/early OS X builds and versions, is now looking for this piece of software history, but not a whole lot can be found about it online. I did run into a thread in comp.sys.next.advocacy from 1995 in which a Robin D. Wilson sheds some more light on the fate of this product:
And we ran it on an RS/6000 model 540 (with 63MB of RAM no less) -- it was pretty fast. The thing that killed it is Steve Jobs wanted IBM pay more money for 2.0. They had only _just_ finished porting 1.0 to AIX (it did run on top of AIX -- and there were several hacks made to accomodate it -- but it did run fine). When NeXT was shipping 2.0, IBM felt they wouldn't be able to sell 1.0 (there we some rather dramatic improvements between 1.0 and 2.0). They also didn't want to spend more money on it (as SJ was demanding for 2.0), and they didn't feel like porting 2.0 would take any less time (meaning they wouldn't get done until NeXT released a newer version). All that considered -- IBM abandoned NS.
This wasn't a "bad decision" by SJ (per se), but I can see IBM's view on this easier than I can see NeXT's...
Steven also stumbled upon a very, very long FAQ about NextStep/AIX, which contains tons of information. This will probably be very hard to find, but for the sake of digital archaeology and preservation, we really need to find it somewhere and preserve it. Absolutely fascinating.
But today's breakthroughs would be nowhere and would not have been possible without what came before them - a fact we sometimes forget. Mainframes led to personal computers, which gave way to laptops, then tablets and smartphones, and now the Internet of Things. Today much of the interoperability we enjoy between our devices and systems - whether at home, the office or across the globe - owes itself to efforts in the 1980s and 1990s to make an interoperable operating system (OS) that could be used across diverse computing environments - the UNIX operating system.
As part of the standardization efforts undertaken by IEEE, it developed a small set of application programming interfaces (APIs). This effort was known as POSIX, or Portable Operating System Interface. Published in 1988, the POSIX.1 standard was the first attempt outside the work at AT&T and BSD (the UNIX derivative developed at the University of California at Berkeley) to create common APIs for UNIX systems. In parallel, X/Open (an industry consortium consisting at that time of over twenty UNIX suppliers) began developing a set of standards aligned with POSIX that consisted of a superset of the POSIX APIs. The X/Open standard was known as the X/Open Portability Guide and had an emphasis on usability. ISO also got involved in the efforts, by taking the POSIX standard and internationalizing it.
A short look at the history of UNIX standardisation and POSIX.
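The APIs POSIX standardized are still the substrate of everyday code. As a small illustration (my own example, not from the linked article), Python's os module is a thin wrapper around the POSIX.1 system calls, which is why the same script runs unchanged on Linux, the BSDs, macOS, AIX, and any other compliant system:

```python
import os

# os.open(), os.write(), os.close(), and os.getpid() map directly onto
# the POSIX.1 calls of the same name - the common API the standard was
# created to guarantee across otherwise very different UNIX systems.
path = "/tmp/posix_demo.txt"
fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o644)
os.write(fd, f"pid: {os.getpid()}\n".encode())
os.close(fd)

with open(path) as f:
    print(f.read(), end="")  # prints something like "pid: 12345"
```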
Let's talk about elections! Except not the American ones, but the Dutch elections, coming up in March.
Concerned about the role hackers and false news might have played in the United States election, the Dutch government announced on Wednesday that all ballots in next month's elections would be counted by hand.
We haven't used electronic voting machines ever since it was demonstrated they were quite easily hackable, but everything higher up in the stack was still electronic - such as counting the paper ballots and tallying up the results from the individual voting districts. The upcoming election will now be done entirely by hand - voting, counting, and tallying - making it that much harder for foreign powers to meddle in our elections.
This switch to fully manual voting comes two days after Sijmen Ruwhof posted a detailed article explaining just how easy it would be to hack our voting process.
Journalists from Dutch TV station RTL contacted me last week and wanted to know whether the Dutch elections could be hacked. They had been tipped off that the current Dutch electoral software used weak cryptography in certain parts of its system (SHA1).
I was stunned and couldn't believe what I had just heard. Are we still relying on computers for our voting process?
Turns out the "security" of the counting machines and software, as well as the practices of everything around it, is absolutely terrible. The article is an endless stream of facepalms - and really shines a light on just how lacklustre the whole electronic part of the process was, and hence provides an interesting look behind the scenes of an election.
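For context on why SHA-1 counts as "weak cryptography" here: its digest is only 160 bits, and practical collision attacks against it have been demonstrated, so it can no longer be trusted to guarantee that data (say, a vote tally file) hasn't been swapped for a forged one. A minimal sketch (hypothetical data, not from the electoral software) comparing it with SHA-256 using Python's hashlib:

```python
import hashlib

# Hypothetical tally data - purely illustrative.
data = b"district 042: 1,234 votes"

sha1 = hashlib.sha1(data).hexdigest()
sha256 = hashlib.sha256(data).hexdigest()

# SHA-1: 160-bit digest, collisions demonstrated in practice.
# SHA-256: 256-bit digest, no known practical collisions.
print(len(sha1) * 4, len(sha256) * 4)  # prints: 160 256
```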
In the days after Donald Trump won November's presidential election, immigration and civil liberties advocates began assessing how the new president might carry out his promises to create a registry of Muslims and deport millions of undocumented immigrants. Almost immediately, it became clear the Trump administration would need data, and a lot of it, in order to not only peg people's religious affiliation and immigration status but also allow federal agents to verify their identities and track their whereabouts. Information that could be used for such purposes is collected and stored by a variety of state agencies that issue driver's licenses, dispense public assistance, and enforce laws.
In Washington state, The Verge has learned, Democratic governor Jay Inslee has directed members of his policy and legal staff to work with a handful of state agencies to identify data that could be utilized by Trump's deportation officials, and how, if possible, to shield any such information from federal authorities engaging in mass deportation. In California and New York, Democratic lawmakers have proposed legislation to block state data from federal immigration authorities. Democratic legislators have also proposed bills in Washington state, California, New York, and Massachusetts that would prevent state data from being used by federal authorities to build a registry of people belonging to a certain religion.
Trump, the Republican party, and their supporters are avid advocates of states' rights, so I'm sure the Republican Trump regime will welcome these moves with open arms.
So you've installed Haiku from a recent nightly (or sometime soon, the R1 beta) and you're launching applications from the Deskbar menu (the blue 'leaf' menu). Perfect, but there are a few more options to investigate if you want to quickly launch your favourite programs.
Neat little overview. For a second there I thought they were replacing Deskbar, and I nearly had a heart attack.
Apple Inc. is designing a new chip for future Mac laptops that would take on more of the functionality currently handled by Intel Corp. processors, according to people familiar with the matter.
The chip, which went into development last year, is similar to one already used in the latest MacBook Pro to power the keyboard's Touch Bar feature, the people said. The updated part, internally codenamed T310, would handle some of the computer's low-power mode functionality, they said. The people asked not to be identified talking about private product development. It's built using ARM Holdings Plc. technology and will work alongside an Intel processor.
And before you know it, you have a MacBook ARM.
Earlier today, The Irish Times ran an "article" titled "Brussels broke the rules in its pursuit of Apple's €13bn". That sounds serious, and would definitely have you click. Once you do, you read an article written by "Liza Lovdahl-Gormsen" without any sources, which is basically an almost word-for-word rehash of letters and answers from Tim Cook about the tax deal. The lack of sources and Tim Cook-ery tone of the piece should set off thousands of huge and loud alarm bells in anyone's mind, but it isn't until the very last paragraph of the "article" that the reader stumbles upon this:
Liza Lovdahl-Gormsen is director of the Competition Law Forum and senior research fellow in competition law. This article was commissioned from her by Apple and supplied to The Irish Times
Pathetic and disingenuous at best, intentionally misleading and ethically reprehensible at worst. The fact that the biggest, richest, and most powerful company in the world has to resort to this kind of unethical behaviour should tell you all you need to know about how certain Apple is of its own claims about the tax deal.
Ever launch an app on your iPhone and then get a pop-up warning that says the app may slow down your iPhone? (I have old versions of certain apps, so it shows up for me every once in a while.) That warning usually appears when you're using a 32-bit app. You can still run the app, and you probably don't even notice the slowdown you've been warned about (at least in my personal experience).
Your ability to run that 32-bit app is coming to an end. As several other Mac sites have reported, Apple has updated the pop-up warning in the iOS 10.3 beta to say that the 32-bit app you're running "will not work with future versions of iOS." The warning goes on to say that the "developer of this app needs to update it to improve its compatibility."
It'd be interesting to know if this actually affects all that many people.
Historically, the code for Chrome for iOS was kept separate from the rest of the Chromium project due to the additional complexity required for the platform. After years of careful refactoring, all of this code is rejoining Chromium and being moved into the open-source repository.
Due to constraints of the iOS platform, all browsers must be built on top of the WebKit rendering engine. For Chromium, this means supporting both WebKit as well as Blink, Chrome's rendering engine for other platforms. That created some extra complexities which we wanted to avoid placing in the Chromium code base.
There is no Chrome for iOS. It doesn't exist. Just because it has a Chrome-like UI doesn't mean it's Chrome. Chrome is the whole package - UI and engine. Without the engine, it's not Chrome. I understand Google wants to leverage the brand recognition, and I know I'm splitting hairs, but until Apple allows competing browser engines, iOS only has one browser, with a bunch of skins.