Microsoft contract win put down to Linux skills shortage

Microsoft may find that a monopoly on developers will help it maintain its grip on the software market in the face of Linux alternatives.

Redmond is currently touting the roll-out by the Royal Institution of Chartered Surveyors (RICS) of Microsoft's server and back-end technology in its e-commerce-driven online portal as a win over Linux, the platform on which the site originally ran.

But Richard Carlson, head of business systems at RICS, said that the decision to go with Microsoft was taken very early on, before the job was put out to tender, on the basis that RICS' in-house developers were in the main Microsoft coders.

'There was no religious argument that we went through,' said Carlson. 'As the head of IT I had to make an objective strategic decision.

'Really it was down to we had a lack of skills in-house. We have very good Microsoft skills, but with Linux we were less comfortable.

'We had to ask: can our developers do this under Linux? Do they want this technology? How does it affect their career development?'

Carlson said he saw the site upgrade as starting from the ground up and that there was no sense of eschewing Linux in favour of Microsoft: the tender for the contract went out on the understanding that the systems would be Microsoft-based in the first place.

He described the original site as a 'brochure' site: a single point solution that simply hosted information for others to view with 'everything in one pot'.

But the new site is vastly more complicated - including contextual areas or zones where users have access to different services depending on their status, complex information-gathering tools linked in with Microsoft back-end systems, and e-commerce facilities - essentially to sell RICS books and conference tickets.

Microsoft Commerce Server 2002 and Content Management Server 2002 handle front-end processes and content. These have been linked into back-office systems using Microsoft BizTalk Server 2004.

'There were two main functions [for the new site]. One, we needed the provision of monitoring and managing information, and two, our commercial activities. Also we needed to be able to integrate these into our back end systems,' said Carlson.

Existing back-end systems were already built mainly on Microsoft SQL Server 2000 - another reason for the early decision to go with Microsoft for the online system.

Nick McGrath, head of platform strategy for Microsoft in the UK, said to expect a strong dominance of Microsoft-skilled programmers in the future too: 'Most go where there is a mainstream of code,' he said, and described Linux as an 'exceptionally complex technology'.

This could prove a barrier to Linux adoption in the public sector, which is being heavily targeted by Linux vendors but is traditionally a Microsoft shop. Although attracted by access to the code and, given limited budgets, by the economic efficiencies of moving to 'free' software, public-sector organisations may think twice when it comes to adopting new technologies with which their in-house developers are uncomfortable.

Matt Whipp at PC Pro.

More in Tux Machines

KDE: KDE Applications 18.04, KDE Connect, KMyMoney 5.0.1 and Qt Quick

  • KDE Applications 18.04 branches created
    Make sure you commit anything you want to end up in the KDE Applications 18.04 release to them :)
  • KDE Connect – State of the union
    We haven’t blogged about KDE Connect in a long time, but that doesn’t mean that we’ve been lazy. Some new people have joined the project and together we have implemented some exciting features. Our last post was about version 1.0, but some time ago we released version 1.8 of the Android app and 1.2.1 of the desktop component, which we had not blogged about yet. Until now!
  • KMyMoney 5.0.1 released
    The KMyMoney development team is proud to present the first maintenance version 5.0.1 of its open source Personal Finance Manager. Although several members of the development team had been using the new version 5.0.0 in production for some time, a number of bugs and regressions slipped through testing, mainly in areas and features not used by them.
  • Qt Quick without a GPU: i.MX6 ULL
    With the introduction of the Qt Quick software renderer it became possible to use Qt Quick on devices without a GPU. We investigated how viable this option is on a lower-end device, particularly the NXP i.MX6 ULL. It turns out that with some (partially not yet integrated) patches developed by KDAB and The Qt Company, the performance is very competitive. Even smooth video playback (with at least half-size VGA resolution) can be done by using the PXP engine on the i.MX6 ULL. (A minimal sketch of selecting the software renderer follows below.)
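
For readers wondering how that mode is actually switched on, here is a minimal, hypothetical C++ sketch (assuming Qt 5.8 or later and a placeholder qrc:/main.qml) of an application opting into the Qt Quick software renderer; the same switch can be made without touching the code by launching with the QT_QUICK_BACKEND=software environment variable.

    // Minimal sketch: opting into the Qt Quick software renderer, as one
    // might on a GPU-less board such as the i.MX6 ULL. Assumes Qt 5.8+ and
    // a hypothetical qrc:/main.qml shipped with the application.
    #include <QGuiApplication>
    #include <QQuickView>
    #include <QUrl>
    #include <QtGlobal>

    int main(int argc, char *argv[])
    {
        // Equivalent to launching with QT_QUICK_BACKEND=software; the variable
        // must be set before the first Qt Quick window creates its scene graph.
        qputenv("QT_QUICK_BACKEND", "software");

        QGuiApplication app(argc, argv);

        QQuickView view;
        view.setSource(QUrl(QStringLiteral("qrc:/main.qml")));
        view.show();

        return app.exec();
    }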

Red Hat Leftovers

Debian Leftovers

  • RcppSMC 0.2.1: A few new tricks
    A new release, now at 0.2.1, of the RcppSMC package arrived on CRAN earlier this afternoon (and once again as a very quick pretest-publish within minutes of submission).
  • sbuild-debian-developer-setup(1) (2018-03-19)
    I have heard a number of times that sbuild is too hard to get started with, and hence people don’t use it. To reduce the hurdles to using and contributing to Debian, I wanted to make sbuild easier to set up. sbuild ≥ 0.74.0 provides a Debian package called sbuild-debian-developer-setup. Once installed, run the sbuild-debian-developer-setup(1) command to create a chroot suitable for building packages for Debian unstable.
  • control-archive 1.8.0
    This is the software that maintains the archive of control messages and the newsgroups and active files. I update things in place, but it's been a while since I made a formal release, and one seemed overdue (particularly since it needed some compatibility tweaks for GnuPG v1).
  • The problem with the Code of Conduct
  • Some problems with Code of Conducts

OSS Leftovers

  • Can we build a social network that serves users rather than advertisers?
    Today, open source software is far-reaching and has played a key role driving innovation in our digital economy. The world is undergoing radical change at a rapid pace. People in all parts of the world need a purpose-built, neutral, and transparent online platform to meet the challenges of our time. And open principles might just be the way to get us there. What would happen if we married digital innovation with social innovation using open-focused thinking?
  • Digital asset management for an open movie project
    A DAMS will typically provide something like a search interface combined with automatically collected metadata and user-assisted tagging. So, instead of having to remember where you put the file you need, you can find it by remembering things about it, such as when you created it, what part of the project it connects to, what's included in it, and so forth. A good DAMS for 3D assets generally will also support associations between assets, including dependencies. For example, a 3D model asset may incorporate linked 3D models, textures, or other components. A really good system can discover these automatically by examining the links inside the asset file. (A small illustrative sketch of such an index appears at the end of this list.)
  • LG Releases ‘Open Source Edition’ Of webOS Operating System
  • Private Internet Access VPN opens code-y kimono, starting with Chrome extension
    VPN tunneller Private Internet Access (PIA) has begun open sourcing its software. Over the next six months, the service promises that all its client-side software will make its way into the hands of the Free and Open Source Software (FOSS) community, starting with PIA's Chrome extension. The extension turns off mics, cameras, Adobe's delightful Flash plug-in, and prevents IP discovery. It also blocks ads and tracking. Christel Dahlskjaer, director of outreach at PIA, warned that "our code may not be perfect, and we hope that the wider FOSS community will get involved."
  • Open sourcing FOSSA’s build analysis in fossa-cli
    Today, FOSSA is open sourcing our dependency analysis infrastructure on GitHub. Now, everyone can participate and have access to the best tools to get dependency data out of any codebase, no matter how complex it is.
  • syslog-ng at SCALE 2018
    It is the fourth year that syslog-ng has participated in the Southern California Linux Expo or, as it is better known to many, SCALE ‒ the largest Linux event in the USA. In many ways it is similar to FOSDEM in Europe; however, SCALE also focuses on users and administrators, not just developers. It was a pretty busy four days for me.
  • Cisco's 'Hybrid Information-Centric Networking' gets a workout at Verizon
  • Verizon and Cisco ICN Trial Finds Names More Efficient Than Numbers
  • LLVM-MCA Will Analyze Your Machine Code, Help Analyze Potential Performance Issues
    One of the tools merged to LLVM SVN/Git earlier this month for the LLVM 7.0 cycle is LLVM-MCA. The LLVM-MCA tool is a machine code analyzer that estimates how the given machine code would perform on a specific CPU and attempts to report possible bottlenecks. It uses information already present in LLVM, such as a given CPU family's scheduler model, to statically estimate how the machine code would execute on a particular CPU, even going as far as estimating the instructions per cycle and possible resource pressure.
  • Taking Data Further with Standards
    Imagine reading a book, written by many different authors, each working apart from the others, without guidelines, and published without edits. That book is a difficult read — it's in 23 different languages, there's no consistency in character names, and the story gets lost. As a reader, you have an uphill battle to get the information to tell you one cohesive story. Data is a lot like that, and that's why data standards matter. By establishing common standards for the collection, storage, and control of data and information, data can go farther, be integrated with other data, and make "big data" research and development possible. For example, NOAA collects around 20 terabytes of data every day. Through the National Ocean Service, instruments are at work daily gathering physical data in the ocean, from current speed to the movement of schools of fish and much more. Hundreds of government agencies and programs generate this information to fulfill their missions and mandates, but without consistency from agency to agency, the benefits of that data are limited. In addition to federal agencies, there are hundreds more non-federal and academic researchers gathering data every day. Having open, available, comprehensive data standards that are widely implemented facilitates data sharing, and when data is shared, it maximizes the benefits of "big data" — integrated, multi-source data that yields a whole greater than the sum of its parts.
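
To make the digital asset management idea above a little more concrete, here is a small, purely hypothetical C++ sketch of the kind of index such a system keeps: each asset carries its automatically collected metadata, user-assigned tags and dependency links, and can be found by tag rather than by remembering a file path. The names and fields are illustrative, not taken from any particular DAMS.

    // Hypothetical sketch of a tiny digital-asset index: metadata, tags and
    // dependency links per asset, with lookup by tag instead of by path.
    #include <iostream>
    #include <map>
    #include <set>
    #include <string>
    #include <vector>

    struct Asset {
        std::string path;                             // where the file actually lives
        std::map<std::string, std::string> metadata;  // e.g. "created", "project-part"
        std::set<std::string> tags;                   // user-assisted tagging
        std::vector<std::string> dependencies;        // linked models, textures, ...
    };

    class AssetIndex {
    public:
        void add(const Asset &asset) { assets_.push_back(asset); }

        // Return every asset carrying the given tag.
        std::vector<Asset> findByTag(const std::string &tag) const {
            std::vector<Asset> hits;
            for (const auto &asset : assets_)
                if (asset.tags.count(tag))
                    hits.push_back(asset);
            return hits;
        }

    private:
        std::vector<Asset> assets_;
    };

    int main() {
        AssetIndex index;
        index.add({"scenes/shot01.blend",
                   {{"created", "2018-03-01"}, {"project-part", "act-1"}},
                   {"character", "hero"},
                   {"models/hero.blend", "textures/hero_diffuse.png"}});

        for (const auto &asset : index.findByTag("hero"))
            std::cout << asset.path << "\n";
        return 0;
    }

A real system would persist such an index in a database and, as the article notes, populate the dependency lists automatically by examining the links inside each asset file.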