2014-09-08

snydeq writes
Desktop workloads and server workloads have different needs, and it's high time Linux considered a split to more adequately address them, writes Deep End's Paul Venezia. "You can take a Linux installation of nearly any distribution and turn it into a server, then back into a workstation by installing and uninstalling various packages. The OS core remains the same, and the stability and performance will be roughly the same, assuming you tune the system along the way. Those two workloads are very different, however, and as computing power continues to increase, the workloads are diverging even more. Maybe it's time Linux is split in two. I suggested this possibility last week when discussing systemd (or that FreeBSD could see higher server adoption), but it's more than systemd coming into play here. It's from the bootloader all the way up. The more we see Linux distributions trying to offer chimera-like operating systems that can be a server or a desktop at a whim, the more we tend to see the dilution of both. You can run stock Debian Jessie on your laptop or on a 64-way server. Does it not make sense to concentrate all efforts on one or the other?"

Re:Hogwash

By jedidiah



2014-Sep-8 17:29

• Score: 5, Insightful
• Thread

One of the big reasons that I like Linux is the fact that it is really just a conventional server OS with a GUI bolted on top. That is not a bad thing. That is a very GOOD thing. That means that there is a solid foundation on top of all of the shiny shiny.

Linux is not Windows.

Linux is not MacOS.

There is no point in mutilating Linux to pander to people who will never appreciate Linux on its own terms.

That's rather the whole point.

Re:I think this is a good idea.

By Anrego



2014-Sep-8 17:47

• Score: 4, Interesting
• Thread

This is actually a major benefit of Gentoo, and one of the reasons I run it on my servers (they are all hobby-ish; I get that Gentoo in production is probably a bad idea).

Trying to run a Debian or similar server, you inevitably end up with a bunch of X packages because some random tool comes with a built-in GUI and no one bothered to package a non-X version.

It extends even beyond X or no-X. You find yourself with database drivers for all the major (and some minor) databases regardless of whether you use any of them, and loads of other cruft.

This is obviously part of the tradeoff for a system that just works, but it's annoying when some gnome library breaks the update on a _server_.
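For anyone curious how Gentoo sidesteps that, the mechanism is USE flags: global flags in /etc/portage/make.conf decide which optional features every package is built with, so GUI toolkits and unused database bindings simply never get compiled in. A rough, server-oriented sketch (the flag set here is illustrative, not a recommended configuration):

    # /etc/portage/make.conf -- illustrative server-leaning settings
    # Keep X and the common GUI toolkits out of every build
    USE="-X -gtk -qt5 -gnome -kde -wayland"
    # Skip database client bindings this box never uses
    USE="${USE} -mysql -postgres -sqlite -ldap"

Per-package exceptions go in /etc/portage/package.use, so a single tool that genuinely needs, say, sqlite can have it re-enabled without dragging the flag back in globally.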

As a side note, it's becoming increasingly frustrating to be a non-systemd user. I've had to re-arrange a tonne of packages as stuff switches. I know systemd is inevitable, but I'd like to hold out just a little longer :(

A reason supercomputers and phones use Linux

By raymorris



2014-Sep-8 17:52

• Score: 5, Insightful
• Thread

98 of the top 100 fastest supercomputers in the world run Linux. Most phones also run Linux. See also consumer electronics of all kinds - TVs, routers, webcams, consumer NAS drives ... Linux works everywhere. As Linux has been installed everywhere over the last few years, Microsoft has gone from a monopoly, the 800 pound gorilla, to trying to catch up in order to survive.

There is a reason for this. Linux didn't make any assumptions about what hardware people were going to use next week. Even the architecture could be whatever you wanted that day - DEC Alpha, Blackfin, ARM (any), Atmel AVR, TMS320, 68k, PA-RISC, H8, IBM Z, x86, assorted MIPS, Power, Sparc, and many others.
Microsoft built specifically for the desktop, and supported one platform - x86. Suddenly, most new processors being sold were ARM, and screens shrank from 23" to 4". Microsoft could only scramble and try to come up with something, anything that would run on the newly popular ARM processors, and ended up with Windows RT. Linux kept chugging along because it had never made any assumptions about the hardware in the first place. To start making those assumptions now would be stupid.
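That hardware-agnosticism shows up directly in the kernel's build system: the same source tree targets a different architecture just by switching a couple of make variables and pointing at a cross toolchain. A minimal sketch, assuming an aarch64 cross compiler is installed under the usual aarch64-linux-gnu- prefix:

    # Same kernel tree, different target architecture
    make ARCH=arm64 CROSS_COMPILE=aarch64-linux-gnu- defconfig
    make ARCH=arm64 CROSS_COMPILE=aarch64-linux-gnu- -j8

Swap in arm, powerpc, mips, sparc, and so on, and the same build machinery does the rest, provided the matching toolchain exists.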

We don't know whether smart watches will be all the rage next year, or if cloud computing will take off even more than it has, or virtualization, or a resurgence of local computing with powerful, battery-friendly APUs and roll-up displays. To specialize for "desktop" hardware or "server" hardware would be dumb, because we don't know what those are going to look like five years from now, or if either will be a major category. How many people here remember building web sites for WebTV? How well did that pay off, investing in building a WebTV version, then a Playstation version? The sites that weathered these changes best were the fluid, adaptive ones that don't CARE what kind of client is being used to view them - they just work, without being tailored to any specific stereotype of user.

Huh?

By c



2014-Sep-8 17:53

• Score: 4, Insightful
• Thread

I assume that this is yet another click-bait blog-spam article, because I can't imagine that anyone who knows jack about Linux distributions wouldn't be aware that server and desktop variants of various distributions have been and still are done.

More to the point, anyone who wanted it done that way would've or could've already done it. That the more popular distros don't generally make the distinction or don't emphasize it should be taken as a fairly solid answer to the question posed in the headline.

I reject your premise, and substitute my own.

By marphod



2014-Sep-8 18:35

• Score: 3
• Thread

Windows Server and Windows Desktop don't use the same OS? What definition of Operating System are you using here?

They have the same system libraries. They have the same kernel, albeit optimized and configured differently. They support the same APIs, run the same applications, use the same drivers, support the same authentication engine, support the same UIs and shells, and use the same package delivery systems. There are differences, but I've yet to see any technical reason why you couldn't turn a Server edition into a Desktop release or vice versa.

As a counterpoint, the Ford Mondeo (4-door/5-door midsized vehicle) uses the same platform as a Land Rover Range Rover Evoque. They have the same frame, many of the same components, and otherwise take advantage of factory line construction and economies of scale. However, in this case, you could at least argue that they have different 'Operating Systems' -- they have some differences which are arguably just optimizations and tuning changes (handling characteristics, consoles, etc.) but others that are physical differences (seats, load/capacity, etc.). You don't see Ford running out to split the platform, though. Why? Because it doesn't make sense. There are more things in common at the core than are different, and they can make more products at a lower cost by sharing the core of the car platform. Ford has a dozen or so active car platforms, used by different models across their various brands; most other carmakers do the same.

The author is making one of several possible basic errors.
1) They don't really understand the definition of a Linux distribution (e.g. RHEL v CentOS v TurnKey v XUbuntu v Arch v etc.)
2) They don't really understand the differences between Windows Server and Windows Desktop
3) They don't really understand the definitions of the Linux kernel, GNU/Linux, and the Linux OS
4) They don't really have a grasp of how software is made or how source code is shared
5) They weren't loved enough as a child and are desperately seeking attention.

This is like saying we need to create separate compilers for AMD and Intel chips because their microarchitectures differ. It shows a lack of understanding of both the problem and of how one would go about solving it.
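To push the compiler analogy one step further: GCC already covers both vendors from a single code base, with the target picked by a flag rather than by shipping two compilers. A hedged sketch (app.c is a stand-in source file; the -march values are just examples of real targets):

    # One compiler, different micro-architecture targets
    gcc -O2 -march=core2  -o app-intel app.c   # tune for an Intel part
    gcc -O2 -march=btver2 -o app-amd   app.c   # tune for an AMD part
    gcc -O2 -march=native -o app-local app.c   # whatever this machine happens to be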
