- Linux vs mac windows memes
- Linux vs Mac vs Windows
- Linux vs. Windows Internet Battle No Longer Exists Because Linux Is Winning
I still love Ubuntu, but all the offerings of Windows 10 have kind of made me a Windows fanboy and even make macOS seem like a decisive downgrade. I know security is hard and I like to skirt it sometimes as well, but it should still be pointed out that a fingerprint should not be used as a password for security-critical data. It's at best a username: you can't change it, and a motivated person can trivially steal it. Not trying to discourage you from using it like that. It's perfectly fine as long as you realize that a fingerprint is only secure against random people on the street, or just not very competent attackers. There are rough edges, too; for example, a few applications have issues with files that are accessible from Windows.
It's sort of an after-install thing that takes a minute, but I have some keyboard shortcuts that allow me to do things like jump into a sketching program with the current clipboard contents. And Krita itself and a few other good apps are available on Linux. Not sure how ThrottleStop compares to the latest Linux options. I've got many containers going, three instances of VS Code, a zillion tabs, and performance is not an issue. But I realize not everyone cares about a free and transparent world, even when it's more or less as good.
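A sketch of how such a shortcut can work under X11 (the tool names and the temp-file approach here are my assumptions, not necessarily this commenter's actual setup): dump the clipboard image to a file and hand it to Krita, then bind the script to a key in the desktop's keyboard-shortcut settings.

```shell
#!/bin/sh
# clip-to-krita.sh -- illustrative only; assumes an X11 session with
# xclip installed and krita on $PATH.
set -e
tmp=$(mktemp --suffix=.png)
# Dump the clipboard's PNG contents (fails cleanly if it isn't an image)
xclip -selection clipboard -t image/png -o > "$tmp"
# Open the grabbed image in the sketching app
krita "$tmp" &
```

Bound to a key via, say, GNOME's Settings > Keyboard Shortcuts, this gives roughly the "jump into a sketching program with the clipboard" behavior described above.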
It must at least encourage companies like Microsoft to keep opening up and getting better. It absolutely is fine for a developer. But my parents would never be able to get accustomed to Ubuntu or any other distro. It was hard enough to make them use email. The truth is that the vast majority of people just want things to work.
Like turning on a TV without any setup. OECD studies have shown that more people than one thinks are incapable of using search in email. But I wasn't talking about your parents. That's true: some things are easier on Mac. In particular, I'd say that using software that hasn't been packaged for your distribution is much easier on Mac. However, your example of fonts is definitely not one of those areas anymore. Font rendering on Linux is as advanced and capable as on any other OS, including in the areas of kerning and hinting.
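Those rendering knobs are user-tunable through fontconfig. A minimal per-user config, with illustrative values rather than recommendations, looks like:

```xml
<?xml version="1.0"?>
<!DOCTYPE fontconfig SYSTEM "fonts.dtd">
<!-- ~/.config/fontconfig/fonts.conf: enable antialiasing, slight
     hinting, and RGB subpixel rendering with the default LCD filter -->
<fontconfig>
  <match target="font">
    <edit name="antialias" mode="assign"><bool>true</bool></edit>
    <edit name="hintstyle" mode="assign"><const>hintslight</const></edit>
    <edit name="rgba" mode="assign"><const>rgb</const></edit>
    <edit name="lcdfilter" mode="assign"><const>lcddefault</const></edit>
  </match>
</fontconfig>
```

Most distros ship sane defaults along these lines, so this file is only needed when overriding them.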
It should just work without any user intervention and look great. And this article is a perfect example of that. And we shouldn't be telling new Linux users to try anything but those two. Many engineers fiddle with various "hacker" window managers like i3, Sway, Awesome, or fvwm. The author's example of messing around with dmenu because "unix philosophy" is exactly the kind of time sink that people eventually get tired of, because they have better things to spend their time on.
The only advantage I find in LXDE is that it has a titlebar I can 'reveal' if I wish to fiddle with the window position, like when tracking arping replies on a remote LAN and using a bit of scripting to watch the metric evolve as I fiddle with things. I like my desktop lean and mean. I do not want distractions. When I am dealing with a remote system crashing under load, the last thing I want is my desktop or my shortcuts behaving in weird ways.
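That arping-plus-scripting workflow is roughly a loop like the following (interface name and addresses are made up for illustration):

```shell
# Watch a remote host's ARP responsiveness and the default route's
# metric evolve while fiddling with the network. Requires root for
# arping on most systems; Ctrl-C to stop.
while true; do
    # One ARP probe; keep just the reply timing line
    arping -c 1 -I eth0 192.168.1.1 | grep -o 'time=.*'
    # Show the current metric on the default route
    ip route show default | grep -o 'metric [0-9]*'
    sleep 2
done
```

Piping the output to a file gives the "evolution of the metric" record the commenter describes.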
Things must always work, in a consistent way. Funny thing is, I can only get that in Linux. Customization is a feature; it's just that not everyone needs that feature. Some people try that too, and XFCE. Admittedly, they're better than a tiling window manager, but LXDE's last stable release was a long time ago, and it doesn't have a compositor, so all of that graphics hardware in your computer, specialized to keep X11 DAMAGE events from forcing application redraws and to save power, is going unused.
GNOME's compositor certainly doesn't, in my experience. It would spin my fans up after a few moments of dragging a window around. Of course, it has no configuration options, so it is not possible to fix. I agree; I'm considering moving to Wayland, but for now LXDE does everything I need well enough that I haven't bothered for the last few years. Yeroc 6 months ago. I'm curious what you find distracting in Gnome? Unless you're running in Classic mode, the newer versions got rid of everything except the top menu bar, and that is less busy than it used to be.
The default launcher works very well using the keyboard only; I actually quite like it, though there are definitely other things I don't like about the Gnome defaults. What I find distracting: the menu bar indeed, the file manager on the desktop, the title bars of the windows, the buttons on those title bars. That doesn't leave much of Gnome.
No need to reveal the title bar except perhaps to read the occasional title. Correct. I really like alt+left-click to move windows, but for some reason I prefer resizing with the title bar, even if it takes one more keypress to make it show up, and one more to remove it when I'm done.
Off topic, but have you looked at LXQt? It is surprisingly nice, albeit with slightly more memory usage. Interesting; I will consider it, thanks a lot for the suggestion! Memory use is a concern, but so are wakeups in a tickless kernel on a laptop. At the moment I'm considering Sway, mostly because of the wide use and community, but it's a "long term" project for this month or the next. Users have different needs and preferences, so I see the fragmentation as a positive thing, because it gives people choice.
For me personally, xmonad, one of these hobby projects, has been perfectly sufficient for the last 10 years. More importantly, though, I find it actually reduces my mental workload, since I no longer have to handle window placement myself. DJHenk 6 months ago. I have had i3 on every computer I use for a couple of years now, and I don't think I will ever go back to a conventional window manager. No matter what I'm doing, every useful or efficient placement of windows is always no more than one or two keystrokes away. Symmetry 6 months ago. Eh, I've been running xmonad within Gnome for quite a while now (checks git blame), and I don't think I'm changing things anytime soon.
Well, for the first 2 years I fiddled a lot and made sure I could configure a computer the way I liked with a script. Occasionally I'll have an issue, like Ubuntu switching to Gnome 3, but even that, the biggest disruption in years, just took an evening of fiddling to fix. Hello71 6 months ago.
This isn't entirely true. However, these are usually toggleable via the GUI, and even without them it usually still looks fine. Since Mojave, macOS no longer does subpixel rendering. Klonoar 6 months ago. On non-Retina displays, which... On all displays. It matters less on Retina displays. External displays are nowhere near the normal use case? No, my point is that this change only affects non-Retina displays. In my experience worldwide, at many companies, anyone who uses a Mac with an external monitor generally doesn't settle for some POS. It's a high-end screen that matches the MacBook, hence why it's not that big of a deal.
The change affects all displays. The highest supported resolution on a MacBook is still scaled down. Only a couple of expensive LG displays match the actual density, as far as I know. The MacBook Air only got a Retina display a few months ago. The low-end iMac is still non-Retina.
Subpixel order is pretty universally standardized. Is there an EDID data element for pixel order? Wowfunhappy 6 months ago. I didn't know that! How does subpixel rendering work on CRTs, which to my understanding don't have a set matrix of pixels and subpixels? It worked like shit, blurring perfectly fine text. At least the last time I had a CRT, which was the very early 2000s. Of course OS X is more user friendly now, but the Linux desktop has improved by leaps and bounds.
I think in a few years you will see the Linux diaries continue in popularity, especially among developers. Laptops have become commodity items. People have been saying that ever since, uhm, the first Mandrake release? As soon as auto-configuring XFree86 was kinda figured out, out goes XFree86 and in comes X.org. Out goes Xorg, in comes Wayland. Gnome 2 worked out the kinks? Time for Unity! KDE 4 finally getting snappy? Init systems figured out? ALSA getting adoption? PulseAudio finally working? And so on and so forth, in an endless churn. By the time it gets fixed, it will be time to replace it.
And so the experience is a perennial struggle against half-finished, unpolished software. Yes, the churn is an issue, especially in desktop environments, but it has been relatively painless all things considered. I don't think there has been a single inflection point; for me there has been steady incremental improvement. If you want to think about how far things have come: I started using Linux back in the Mandrake days.
Around the 2.x kernels, so much has changed since the bad old days. I don't want to throw out a "back in my day we walked uphill in snow both ways" style rant, but people complain about changes like ALSA, PulseAudio, etc., and I think there are a lot of rose-tinted glasses being applied to how things were before. Sure, some things aren't perfect, but neither were their predecessors, and on the whole they fixed more things than they broke. Never going to happen. The reality is, people get paid more to develop for the Mac world and put more effort in consequently, whereas Linux is still mainly volunteer-driven.
I'm thinking specifically of basically any time you need to use Mac-only software. IMHO, all of the major modern operating systems are good enough and have been for a while now. Pick the applications you want, then find the OS that best supports those applications. It would be nice if the Purism guys would open some retail stores. Once you can walk in somewhere and get help, it gets a lot easier to recommend those machines to less tech-savvy relatives. Yes, Hollywood is pretty much sold on Maya and Houdini on Linux, but they use their own in-house distributions and have no issue dealing with binary blobs for performance.
Meanwhile, my Asus netbook sold with Linux still can't do video decoding in hardware or OpenGL 4, in spite of DirectX 11-class hardware, because AMD decided to reboot their driver development. I see lots of advice online to just use the kernel driver for AMD cards these days, but in my experience it is slow. On some of my machines it benchmarks slower than the Intel graphics.
It is disappointing. I still tend to prefer Nvidia graphics when I'm building my own systems, because even if the driver is a big binary blob, it does work. I don't care much about performance because I mostly play strategy games; my issue with the kernel driver is its instability, since many games will hard-crash my system. I would also say that Apple has plenty of rough spots that Linux doesn't, as well. Font rendering the Apple way is a style choice at best. Given how much of a hacker's usage of a computer is working with text (reading pages, writing code and documentation, taking notes), it's a style choice that actually has a significant impact.
By the way, macOS Mojave deprecates subpixel antialiasing, a poor decision when there are many non-hidpi displays still in use. I've used both MacBooks and Linux and have never noticed a difference, so I have no clue what sort of impact I'm supposed to be noticing. I remember a decade ago, when there was more discussion about it, the rough situation was: OS X fonts were "blurry" because the font rendering was optimized for staying true to the font shape.
Microsoft was "crisp but kerning broken" because the font rendering optimized for pixel alignment, so fewer grey pixels, but shapes moved slightly to fit pixel borders. Linux was somewhere in between, however you configured it. That was the time of lower-DPI screens, though. With high-DPI Retina screens these days it doesn't matter much anymore. I actually prefer Windows rendering now to Mac.
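The tradeoff is easy to see numerically. A toy layout routine (invented advance widths, not real font metrics) that snaps each glyph advance to a whole pixel, Windows-style, drifts further from the "true" fractional layout with every glyph, which is the "crisp but kerning broken" effect:

```python
def layout(advances, snap_to_pixel):
    """Return each glyph's x origin. With snap_to_pixel=True every
    advance is rounded to an integer (crisp stems, drifting spacing);
    otherwise fractional positions are kept (true shapes, grey edges)."""
    x = 0.0
    origins = []
    for advance in advances:
        origins.append(x)
        x += round(advance) if snap_to_pixel else advance
    return origins

# Five glyphs, each 6.4 px wide in the font's ideal metrics
advances = [6.4] * 5
true_pos = layout(advances, snap_to_pixel=False)
snapped = layout(advances, snap_to_pixel=True)
# The gap between the two layouts grows with every glyph
drift = [abs(t - s) for t, s in zip(true_pos, snapped)]
```

Real rasterizers mix both strategies per axis and per glyph, but the cumulative-drift intuition is the same.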
"My 1Password Firefox extension broke on Ubuntu." "Hey there! Beyer from 1Password here. Thanks for using 1Password!" And just like that, we've waded into "cracking open the window manager" just to figure out what's going on. "What is your X session? It might be that your WM is not launching some service that the Wayland session does, and 1Password needs." The fact that you're asking this question, and suggesting that root cause, is itself a significant part of the problem the OP was talking about.
Unfortunately, the openness allows for so much flexibility that many people end up breaking things. Invariably, they will blame the system and not their changes. If they used the defaults, like they do on other systems, it would not happen. In this case, though, LastPass is broken in Firefox on Ubuntu by default.
Edit: akiselev said 1Password, not LastPass. My mistake. Interesting that I experienced this same issue with LastPass. This was a clean install of Gnome Shell; it worked just fine on Ubuntu before. After I log in through the 1Password website, the extension window opens just fine, so it might be some input security service. But how would a Firefox extension even have access to a system service like that, except through Firefox's built-in APIs?
I don't know how 1Password works; I'm using KeePassXC. With KeePassXC you have a native application and an extension that communicate via a socket. The native application can use whatever native APIs it wants. However, back to you: what's weirder is that, if it is a pure Firefox WebExtension, Firefox still by default launches as an X11 application under Wayland, so the extension should have no way to know the difference.
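For what it's worth, the extension-to-native-app channel mentioned above rides on the WebExtension native-messaging framing: each JSON message is prefixed with a 4-byte little-endian length. A minimal sketch of that framing follows; the function names are mine, and the real keepassxc-browser protocol additionally encrypts the payload, which this omits.

```python
import json
import struct

def encode_message(obj):
    """Frame a dict as native-messaging bytes: 4-byte little-endian
    length prefix followed by the UTF-8 JSON payload."""
    payload = json.dumps(obj).encode("utf-8")
    return struct.pack("<I", len(payload)) + payload

def decode_message(buf):
    """Decode one frame from buf; return (message, remaining_bytes)."""
    (length,) = struct.unpack("<I", buf[:4])
    payload = buf[4:4 + length]
    return json.loads(payload.decode("utf-8")), buf[4 + length:]
```

The native helper reads such frames from stdin and writes replies to stdout; the browser never sees the socket or any other native API directly.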
Someone really needs to make Linux distros just work when hooking a laptop up to a projector. The fact that there's so much trouble, so publicly and with so much concentrated embarrassment, is a serious ongoing PR impediment to Linux. Linux distros got printing licked across the board, so it can happen. How about a Linux version of AirDrop? Last time I hooked up my Linux laptop to a projector, it fired up the screen just fine. I didn't even have to hit some magic key combo to turn it on. The time before last, when I went to a Golang meetup, the presenter's Linux laptop embarrassed him for a few minutes, and lots of devs with MacBooks made some Linux jokes at his expense.
This was the story with printing in Linux as well, back in the day. It would work fine for some, and be a nightmare for others. This is definitely a case where the details matter. The guy who runs Slackware on some no-name laptop is probably going to struggle more than the guy running Ubuntu on the Dell.
It was Bionic Beaver. He was a well-heeled SV manager, and he had a fairly new laptop. Not sure if it was a Dell. This also happened with printing. People would say it worked fine for them, then point out that scroungers on quirky old laptops were getting what they deserved. Really, the fault wasn't those quirky old laptops, but rather fragile and not-so-well-standardized software. Thus, for the whole 13 months I worked there, I got to share several laughs about how "it's the year of the Linux desktop, yeah?"
I got to share several laughs about how "it's the year of the Linux desktop, yeah?" Seems to me it's not been a thing for Windows since before Windows 7. It's long been speculated that one problem with Linux is cultural. Are Linux desktops trapped by cultural expectations? The only piece of this I'm going to engage on is the kernel panic bit, and only as far as a counter-anecdote (so take it as you will), but I had almost as many kernel panics on a MacBook Pro over a bit over a year of using it at a past job.
Probably double or triple the frequency if external displays were connected through a full docking station; probably something odd in the PCIe code, potentially even at the firmware level. "I had almost as many kernel panics on a MacBook Pro over a bit over a year of using it." I'm still rocking my MacBook Pro. It's been solid. I can only stand to do my development on a Windows machine by using a Linux VM.
This hasn't been an area where Linux has had problems for more than 5 years. Because I definitely got the impression it was a widespread joke in that circle of devs. Also, when I saw it happen to the last poor sap, it happened on Bionic Beaver. Is this more of the Thermocline of Truth? I think dual-headed integrated-plus-discrete GPU laptops using Wayland are still a problem for some people. I don't know why. I gave up on dual-headed laptops a while ago and haven't had a problem with multiple displays using integrated Intel graphics and Wayland. In Linux's defense, even the corporate-issued Dell laptops at my work running Windows 10 struggle with projectors.
People hit fullscreen in PowerPoint and the projector display just disappears, and they have to go through the plug-in-and-unplug-again dance. From everything I've seen, a projector "just working" seems to be the exception rather than the rule. My experience with MacBooks has been excellent. The one time I had a rotten experience on a MacBook was when I had to use Zoom.
I honestly feel like I've seen this same comment on every Linux workstation post for years now. And you know what? It's mostly true, but it's also true for me in reverse. macOS has definitely improved in the last few years. For me, this would be just as true a statement. INTPenis 6 months ago. Everyone has their own view of rough spots, but those aren't mine. The rough spots I'm experiencing with Linux are very specific. Also the lack of native clients for software like Webex Teams, forcing you to use their web apps, which use up so many resources that I'm convinced they've caused my laptop to stall a couple of times.
And of course, perhaps related to the issue above, anything relating to graphics does need work. The major positive thing I can say about using Linux daily in work and personal life is that it works so well that when it fails you get very annoyed. That's a good sign.
It means that it's rare enough to annoy me. If it were too common, I wouldn't be surprised when it fails. I also switched back from Mac to Linux, 2 years ago. But I use vanilla Gnome 3 on Fedora. Before Mac I used tiling window managers, but now I don't see the point. It's just so much configuration to handle, which Gnome does without a single line of config or shell code. Ubuntu and openSUSE are my main installs, and they usually feel fine. My only issue is always drivers, whether it's Wi-Fi or graphics (the worse one, since at least with Wi-Fi I have ways to work around the problem: buy a compatible tiny USB wireless adapter, for example, or connect a router to the network and connect my device over LAN).
Nvidia's drivers used to work on my one laptop; now they've broken something in the latest incarnation of the driver, that or it's just not compatible with Ubuntu's new X. As for the font thing, maybe I'm not a fontphile or something, but they're usually fine for me. PopsiclePete 6 months ago.
Linux is just inconsistent. When it works, it works. When it doesn't, God help you. There's no rhyme or reason. Sometimes my Wi-Fi is broken. On a different distro, it's not. Right now, under Ubuntu, suspend is an even more fun can of worms: resuming from suspend will almost always work (I remember having massive problems with this, so it's not just you; I ended up selling that rig to a friend), but there's a decent chance I'll kernel panic upon plugging in my Type-C dock after said resume. The Killer Wi-Fi card has none of the connectivity problems I read about almost everywhere for this model. I'd definitely not call Linux consistent, but it's better than it was when I started using it back in the day, and I wouldn't trade it for any other setup, and it's not for lack of trying on my primary work machine these past couple of years.
I've also found that on my XPS 13, with the latest release, the screen flickers; it was fine previously. Now I have to open and shut the lid a few times to get it to stop flickering. It's annoying; it would otherwise be perfect. Ah yes, that. A quick workaround is Ctrl-Alt-Fn-F1 to force the screen to lock itself. Makes the flicker go away. That trick worked perfectly. Thanks very much. For me, Linux has improved a lot because Windows has gotten a lot worse, but the Linux distributions I've used have not improved that much (mainly Fedora and Ubuntu). I've been using Linux exclusively for many years and will probably continue to do so, though I fell "in love" with the Budgie desktop experience, which is a project of Solus Linux.
So I "found" Ubuntu Budgie. Best of both worlds, IMHO. It's the closest I've gotten to a Mac on Linux. It still has some rough edges, but nothing that really sticks out for day-to-day work and play. If anything, font rendering in Linux is much better than in OS X. It does far more advanced auto-hinting and LCD subpixel rendering, supports the newest font standards for things like advanced layouts, colored fonts, and emoji, etc. There's also very little display fragmentation other than the choice between Xorg and Wayland, which are largely complementary so far.
But you can already be on Wayland-only in many cases.
No, my experience is much more along the lines of, "I download and install Ubuntu or Linux Mint, and then it works without further issues." H1Supreme 6 months ago. A Wi-Fi driver was the only thing it needed. Which ended up being trivial to install. Everything else worked out of the box.
Two-finger gestures on the trackpad, audio, everything. My current Dell Inspiron didn't need a single driver installed manually with Ubuntu. Pretty impressive. Font rendering is a funny one. Some distros were limited by patent issues, but Ubuntu has looked great for a while. Now Apple has removed subpixel anti-aliasing in Mojave. This may look better on hidpi, but lots of us still have non-Retina Macs.
Ubuntu and its derivatives have pretty good out-of-the-box font rendering, IMO. Yeah, and that goes for macOS and virtually every other piece of technology. So take a user from Windows 10 or OS X or whatever, and sit them in front of your favorite stable Linux distro running a terminal emulator and a browser of your choice. Are the fonts going to be rendered in a way that is unobtrusive for those users, or will things look ugly and difficult to read? Because I'm running a chroot of Debian Buster on my Chromebook, and boy, that terminal sure does look blurry.
And is this a bug that will be fixed when Buster stabilizes, or am I expected to go read some font wiki on a different machine and "guess-and-check" it back to sanity? I don't want to switch to OS X. But I also understand why someone wouldn't trust the UX of a system that ships in "headache mode" by default. Edit: clarification. Anecdotal like the other comments, but I and a few others from my workplace have switched to Dell XPS machines and run Linux now, and it's because we're working with Docker. Even with 16GB RAM and solid-state drives, the performance of Docker on macOS is pathetic, and entirely unusable for our work.
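For context, the slowness is mostly bind-mount I/O through the macOS file-sharing layer, and the usual mitigation was to relax consistency guarantees on those mounts. A hypothetical compose file (service and paths invented) using the macOS-only delegated flag, which Linux simply ignores:

```yaml
# docker-compose.yml (illustrative): "delegated" told Docker for Mac
# that the container's view of ./src may lag the host's, trading
# strict consistency for much better write performance.
services:
  app:
    image: node:18
    volumes:
      - ./src:/app/src:delegated
```

On native Linux the bind mount is a plain kernel-level mount with no translation layer, which is why the same workload feels so much faster there.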
The XPS machines we have are ugly, they're flimsy, they have a grotesque carbon-fibre pattern on them, the keys leave imprints on the monitor after the lid has been closed, the fan drowns out the sound of music in my apartment, the camera is situated about 5mm above the keyboard with its rickety keys; it is without a doubt the ugliest, worst computer I have owned, and I eat tramadol just to handle the back pain from carrying it around in my rucksack. But I'm much less angry and frustrated working on it than I am on a Mac, because it doesn't shit its pants when more than a few containers are running.
I started using an X1 Carbon 6th gen recently and it's an amazing little machine. Extremely portable: a 14" machine that barely weighs anything. It's also very pleasant to touch and hold in your hands, especially if you like the ThinkPad aesthetics. I have to agree. I was never a big fan of laptops before, but I've really come to enjoy my 6th-gen X1. The keyboard is surprisingly nice for a laptop, and the battery lasts long enough not to annoy me when I'm on the move. The fingerprint reader doesn't work, though; not that I know what I would use one for.
I had to install an extra package and add some lines to a PAM config file. I use it to log in, plus to authenticate for anything that requires root. A prompt appears and asks me to swipe my finger. Counter-anecdote: I admit that I'm ignorant about aesthetics, so I'm not addressing those points. I also work at Google, where we have a very solid desktop distribution, providing me a working example of almost every tweak I might want to make.
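For reference, the "extra package plus PAM lines" setup mentioned above is typically fprintd with its PAM module; the added lines look roughly like this, though file names and the fallback stack vary by distro, so treat it as a sketch:

```
# /etc/pam.d/sudo -- try the fingerprint reader first, then fall
# back to the normal password stack if the swipe fails or times out
auth    sufficient    pam_fprintd.so
auth    include       system-auth
```

Fingers are enrolled once with fprintd-enroll; after that, sudo and the login manager prompt for a swipe before falling back to the password.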
Owning the hardware has been a great experience since two changes. First, I swapped out the Killer Wi-Fi card and replaced it with an Intel card. Unfortunately, this is table stakes for any machine with that card. I swapped it in, and it indeed restored the light touch that I liked about the other one. I wouldn't expect many people to do this. As for software, I've settled successfully on Ubuntu. A key part of my success has been committing to Ansible for configuration management.
When I make a settings change I want to keep, I figure out where it's persisted, and then upstream it into my Ansible repo. This has the three-part benefit of teaching me a little about the system (mostly dconf), creating a worklog of changes I've made to the machine, and giving me the psychological comfort that if things get really bad, I can reinstall the OS plus all my tweaks.
Just to be clear, these aren't essential tweaks. I've since expanded the Ansible setup to manage a tiny target-practice server I keep on a cloud service, and over the holiday break this year I successfully set up an old desktop in a closet by (1) installing base Ubuntu, (2) unzipping my Ansible repo to it, (3) running a bootstrap script in the repo that installs Python, git, and Ansible, and runs Ansible on itself, and then (4) drinking a cup of coffee.
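The bootstrap script described there can be as small as the following; the playbook name and apt packages are my assumptions about a repo layout like this, not a prescription.

```shell
#!/bin/sh
# bootstrap.sh -- install the tools Ansible needs, then let the
# repo's playbook converge this machine against itself.
set -e
sudo apt-get update
sudo apt-get install -y python3 git ansible
# Run the playbook locally; --ask-become-pass prompts for sudo
ansible-playbook -i localhost, -c local site.yml --ask-become-pass
```

The trailing comma in `-i localhost,` tells Ansible the inventory is a literal host list rather than a file, which is what makes the single-machine, no-SSH case work.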
By the end I had a machine that was substantially identical to my XPS. I'm sure I'm proving someone's point that Linux isn't ready for the desktop. But from my perspective, the Linux-hardware interface is excellent on a Dell XPS 13 because of Project Sputnik, and if you factor out the likely dozens of hours I spent overengineering my personal Ansible configuration system, I now have a reliable, easily reproducible desktop setup that rivals the manageability of a Chromebook, which is my personal gold standard for desktop statelessness.
All I was trying to say was that these tweaks are user preferences, like background images, not essential things to make the machine usable. Thanks to Sputnik and the Wi-Fi card swap, the machine was usable from the start. Docker performance outside of Linux is always going to be a problem, since it needs to rely on a VM. Our development environment has been Docker on Mac for a few years. It's definitely slow, but we found it acceptable. Do you happen to use a very large image with tons of IO work? Have you tried running Linux on a MacBook? OxO4 6 months ago. I have the 13" version and it very much reminds me of a MacBook Pro aesthetically.
Brushed aluminum chassis, dark keys with backlighting, and an overall sleek, understated design. Nothing like the cheap, plastic Dells I was used to seeing. And it installed Ubuntu MATE without a hitch. Everything worked out of the box except the font scaling. I suspect that's a MATE-specific problem, though. Gnome may work better. JacobJans 6 months ago.
It seems to me there is always an "except" with Linux. Even someone elsewhere in this thread said the developer edition of the XPS worked flawlessly, "except..." The XPS 15 is half a pound heavier than the MacBook Pro 15, which should be barely noticeable when carried in a backpack. I am not quite following. It's light, works well, and is quiet.
The keyboard leaving imprints on the display is no different from my MBP, whose design everyone still seems to love. I am dual-booting my MBP and I absolutely recommend doing that. Not sure about more recent ones, though.
I use a Dell Latitude at work. It's far worse than the XPS I've sampled in shops.
The keyboard is the worst thing I've used since the ZX81 and the Spectrum. Possibly worse, but my memory may fail me. There are different issues with OS X, though, that colleagues of mine had, that never occur with Linux.
Almost all my colleagues struggled with setting up Python 2 and 3 correctly at some point, whereas this just worked for me on Linux. Some struggled with font rendering between a non-Retina external monitor and their Retina displays as well. This is one area where I'd have to concede the Apple experience is objectively suboptimal.
However, the Python ecosystem isn't doing anyone any favors here. Languages and platforms generally are best supported on Linux. Linux package managers do that for you. AWildC 6 months ago. And a project config system that will break everything at the slightest provocation! Seriously, setting up even moderately complicated multi-project solutions usually results in days lost to figuring out why library X isn't linking with project Y.
Nowadays MSVC has built-in CMake integration, a bit like VS Code. You can open a folder with a CMakeLists.txt. Yes, if you can justify using CMake on Windows. Most customers want an MSVC project, though, it seems. And then you want to use that library that isn't an MSVC project.
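As an aside, the "open a folder with CMakeLists.txt" workflow needs nothing MSVC-specific in the project itself. A minimal example, with target and file names invented:

```cmake
# Minimal CMakeLists.txt consumable by MSVC's "Open Folder" mode,
# VS Code, or plain cmake on any platform.
cmake_minimum_required(VERSION 3.15)
project(demo CXX)

add_library(mylib src/mylib.cpp)
target_include_directories(mylib PUBLIC include)

add_executable(app src/main.cpp)
# Linking through targets replaces the per-.vcxproj settings fiddling
target_link_libraries(app PRIVATE mylib)
```

Because the link is expressed as a target dependency, the include paths and link flags propagate automatically, which is exactly the part that goes wrong when mixing hand-maintained MSVC projects with external libraries.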
Or, God forbid, you want to use Clang or GCC. Or Intel's whatever. Apple isn't, either. Except you forgot that Apple ships Python 2. It's actually quite up to date. My Mac on Mojave ships with Python 2. But in my experience they usually don't keep them up to date very well.
Sounds like RHEL, unless you enable Software Collections. RHEL 8 is moving to Python 3, though, with a Python 2 binary kept for system tools that still need it. Like, really, things where you wonder how there could possibly be a conflict, such as scipy not being able to update because matplotlib version whatever is needed by some other lib that interoperates with scipy but itself is updatable... something something something.
I'm still not sure I know the best way to install Python packages. It seems someone has something bad to say about any given method (pkg mgr, pip, venv, etc.). I just use Nix on OS X; it's a better package manager than most Linux distros have. The time investment required for Nix is too high. It's waaaay better than it used to be. A couple of years ago there were a lot of impurities, which would cause everything to break whenever macOS was updated. But these days it just uses libSystem from macOS and the rest is managed by nixpkgs. No argument there, I definitely agree; however, it's easily my favorite package manager.
And it lets me keep my NixOS config similarly. This is at least as bad a problem on Linux (I would argue worse, since I've seen people break system tools written in Python), and it's mostly due to Python being old enough to be both widespread and to have accumulated tons of easily-googled bad advice. I work in an RPM shop, and we've been exploring switching to Python. I was looking at venv, but I guess I hadn't fully looked through to deployment. Do you know if venv works with setup.py?
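For what it's worth, venv does work with setup.py-based projects: a venv is just a directory with its own interpreter and site-packages, and installs resolve into it normally. A sketch, with the package name invented:

```shell
# Create and enter an isolated environment, then install the
# project in editable mode (pip drives setup.py under the hood).
python3 -m venv .venv
. .venv/bin/activate
pip install -e .           # resolves and installs into .venv only
python -c "import mypkg"   # the package is importable inside the venv
deactivate                 # outside the venv, mypkg is not visible
```

For deployment, the same `pip install` (without `-e`) into a venv on the target box, or packaging the project as an RPM that owns its own venv, are both common patterns.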
SllX 6 months ago. To be blunt, Apple would be better served removing any and all of the outdated packages, not bothering to include certain packages at all, and just telling people to use one of the several community-supported package managers. The GNU packages especially come to mind, since they are all out of date anyway. I might be unlucky, but Linux has been a nightmare on the few laptops I've tried it on.
It is not that I have not tried. I did. Linux and its multiple variants have been horrible. But you are ignoring the fact that installing software on Linux is still a mess of resolving all the dependencies through the package manager, whereas on OS X it is just a file move into Applications. Nowadays, I run dev code in a Linux Docker container. I do pretty much everything else on OS X. Your experience running Linux on a laptop is highly dependent on how well it supports the hardware that happens to be in your device. If you're able to research the driver support ahead of time, you may find that Linux runs flawlessly on the laptop you bought.
If you install Linux on a random laptop, then getting wifi, audio, etc. working can be hit or miss, depending on the drivers. As for installing apps, I guess it depends on what you're trying to install. In my experience, installing apps is easier on Linux than on any proprietary OS, as long as you're installing open source applications, working within the package manager, and keeping your system up to date.
If you want to run proprietary software on your open source OS, then yeah, it's more difficult, since Linux distributions aren't really designed for it. My second mainline Linux install was a copy of Red Hat on a laptop. Mind you, this was Red Hat 5, sometime in '95 or '96, I forget.
Several re-compiles later, I had that entire system working: all drivers for all the hardware, including the built-in modem plus sound and PCMCIA ethernet. I got lucky there. I had an Asus laptop that would only pick up wifi if I hibernated it first. I have a Dell touchscreen model that took a full day to get the touchpad to work at all, as Ubuntu kept defaulting to the screen. I'm too afraid to do a fresh install of the current Ubuntu or any other distro because I can't remember how I fixed the touchpad issue.
It sounds like you were really unlucky. The last time I struggled with any of the issues you mention is at least 4 or 5 years in the past. This is also highly dependent on specific experiences. I haven't had any issues with conflicting dependencies on Arch Linux in the past few years. And personally, I appreciate a system package manager for all software instead of having to download applications and drag-and-drop them to install.
It's not like Linux has a built-in solution to the problem; you can definitely still have issues with conflicting versions. BTW, on either platform your best bet IMO is installing all your scripting languages within some kind of version manager. I like pyenv, rbenv, and nodenv, since they work exactly the same way across the three languages. I have a feeling that Apple doesn't want their users to install that kind of development environment, other runtimes, etc. Time Machine also sometimes doesn't back up these installations correctly.
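To illustrate the "exactly the same way" point: the three managers share one command vocabulary, and each pins a project's runtime through a plain version file (a sketch; the version numbers here are just examples, and the tools themselves must already be installed):

```shell
# The same verbs work across all three managers:
#   pyenv install 3.12.2    rbenv install 3.3.0    nodenv install 20.11.1
#   pyenv local 3.12.2      rbenv local 3.3.0      nodenv local 20.11.1
#
# `local` just writes a version file in the project directory,
# which the manager's shims read on every invocation:
echo "3.12.2" > .python-version
cat .python-version
```

Because the mechanism is a committed text file, everyone who clones the project (and has the manager installed) gets the same interpreter version automatically.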
Similarly, TeX installation is hard and is forcefully shoehorned onto the system, and you can't do much better in the current situation. My solution is to run a Linux VM for these cases. For programming and server-like purposes, a headless minimal installation is fine.
But why does each version of the operating system have to consist of such vast changes that people are forced to make leaps and bounds from the previous version? Is Microsoft needing to make such vast improvements because of shortcomings? My personal belief is that Microsoft has over time learned from its own mistakes, and has tweaked its own software over and over again with each new version.
If you ever have a lot of spare time, just start doing some searches on the Internet for users of Vista and the frustrations that go along with it. Not to mention that Vista is a huge resource hog and requires a PC that is at most 2 years old. I will get into the minimum requirements and high hardware costs a little later. Many companies are dragging their feet big time in upgrading from Windows XP to Vista.
Other companies are seeing the light and realizing that there are better options out there, such as Linux, and are migrating away from Windows completely to save themselves from being drawn under. This seems to be even more common in recent times when economic conditions are not so great. Now, to be fair, I have also installed the latest and greatest versions of Linux. Most recently, I have installed Fedora 8, 9, and 10. So, you are probably assuming that it has similar issues to Windows: that it's more bloated than the previous version, has quirks to be ironed out, etc.
Well, the truth is, it doesn't. The main difference that I have noticed is that the versions of software contained in each release are slightly newer with the newer releases of Fedora. Of course, each version has a newer version of the Linux kernel as well, to support the latest and greatest hardware.
Performance is identical. Now, nothing is perfect, and I acknowledge that, especially since Fedora is slated as "beta" software by its sponsor, Red Hat. But remember that even though it's "beta" software, the entire open source community is supporting it, which means that any issues will be identified and resolved rather quickly. Even though Fedora is nominally beta software, in reality it is enterprise-grade, stable software. I will get more into my findings with Fedora 10, and some quirks that I identified and resolved, later.
So, having the latest and greatest should be all good, right? Say what? A downgrade option? This is no joke. Now, you tell me: if Dell is offering a downgrade option, there is seriously something wrong with this supposedly pretty picture of Windows Vista. Well, it turns out that Vista is not everything that Microsoft has cracked it up to be. This shouldn't come as any surprise. Now that Vista has been out for a while and more people have started using it, more are realizing that it has some downfalls behind its pretty front face.
On one hand, I feel a little bad for computer hardware vendors, as they are caught in the middle between Microsoft and the customers using their hardware with Microsoft's software installed on it. Unfortunately, Microsoft has the better end of the bargain, since Dell is responsible for supporting the computers. How often have you called Microsoft about a problem with their software on a home computer?
Probably never, because the vendor is responsible for the support. Corporate users are more prone to calling Microsoft directly instead. Thankfully, Dell does understand this and has offered some limited open source solutions for quite some time. This sounds great, but unfortunately they do not run any major deals on these systems like they do with the ones with Vista. More recent reports now show that Microsoft is turning its head from Vista and focusing on yet another release of Windows, now called "Windows 7".
But wasn't Vista supposed to be such a huge leap forward? And now customers who went out and purchased Vista are supposed to upgrade yet again and buy the even newer version? Microsoft has even admitted that Vista was a "learning experience", and that Windows 7 will make improvements where Vista lacked. Microsoft's Steve Ballmer himself said that Windows Vista is a "work in progress", which further backs up the fact that Microsoft knowingly sold what I would consider beta software, and is already looking away from Vista and on to Windows 7.
Thankfully, I personally did not purchase Vista, nor do I have any plans to. The decision was actually made for me, because I do not have any hardware beefy enough to run Vista with ease. The very ironic thing is that consumers of Microsoft products had been through this very same experience in 2000, when Microsoft released the infamous Windows ME. Users of Windows ME will no doubt recall the huge list of problems, glitches, and strange issues with it, since it was quickly rolled out and soon shoved under the rug by Microsoft.
The subject of Windows ME could be researched in further depth; simply doing a little searching on the Internet will yield a list of the issues that marked it one of the worst operating systems Microsoft ever released. Many consumers who were forced to buy Windows ME on new PCs were then stuck with it, left without much hope other than to buy the best alternative at the time, Windows 2000, or to wait it out until Windows XP.
And yet again, they were forced to buy something to replace what they had already bought shortly before. I feel especially bad for the consumers who tried to stick with Windows ME and just use it as-is. This experience left a very sour taste in the mouths of faithful Microsoft consumers.
Could Vista be yet another repeat of this disaster? It definitely has many signs of the Windows ME failure, but thankfully I don't think it's nearly as serious. It definitely proves to me, though, that Microsoft is still learning from its mistakes, constantly switching directions with its software. Linux, on the other hand, moves and has been moving in ONE direction: forward.
No backtracking and constantly changing gears like Windows. As I just described, it seems that Microsoft almost expects its customers to follow it like lost puppies and buy, buy, buy. Buy Windows today, buy Windows tomorrow. I guess it's nice to be on the cutting edge, but constantly having to pay for upgrade upon upgrade is not a winning situation from the consumer's standpoint. Fortunately, with Linux, upgrades are not nearly as drastic from version to version, because drastic changes are just not practical from a functionality standpoint. And all upgrades are, and always will be, FREE.
Yes, loading up software with lots of bells and whistles is good for marketing, but not so much for functionality, which Microsoft seems to ignore. Linux is very modular in design, so you can start with a basic setup and easily expand upon it, installing additional free software as you need it. Yes, some Linux distributions release new versions in quick succession, sometimes every 6 months to a year. And yes, the upgrade process of installing one version over the top of another is not always a pretty sight (Windows upgrades on top of one another are even worse).
Recall that since the pool of Linux developers is essentially unlimited, current versions of many programs are packaged even for older distributions of Linux. I can still find the latest software, like Mozilla Firefox, Thunderbird, and others, packaged for my Red Hat Linux computers that are 7 years old. Yes, it's pretty common to find programs such as these still released for older versions of Windows as well.
However, some proprietary Microsoft software is only supported on the latest versions of Windows. This can be a huge headache, as it forces consumers to upgrade the Windows operating system in order to install a newer version of Microsoft software. There have been statements going around saying that each version of Windows seems to be more bloated than the previous one.
And this is totally true. In reality, software does need to be updated to keep up with the times, and hardware follows the same path as well. However, Windows made a huge jump in the hardware required to run it with the release of Windows Vista. To be fair, Linux has also grown over the years and requires more system resources than previous versions; however, the jump is not even close to that of Windows in the amount of resources required.
I get into the added hardware costs incurred because of this in a later section of this document. As they say, the proof is in the pudding. And with the next release of Windows, Windows 7, the signs of bloat are ever present. The most notable example of this is the latest trend with netbooks. These are small and lightweight laptops, designed to be less resource-intensive and therefore good for a lot of Internet tasks such as web surfing and email. They are also less expensive than full laptops, which makes them appealing for those who travel and don't want a lot of extra weight or baggage.
Most netbooks today come with either Linux or Windows XP. Some vendors have attempted to install Windows Vista, which has resulted in massive failures because of its resource-hungry nature. Most recently, attempts to run Windows 7 on netbooks have also proven quite a discouragement for Windows users. So, Microsoft has announced that future netbooks with Windows 7 (slated to be called "Windows 7 Starter") will only allow up to 3 programs to be open and running at the same time. Time out: surely this is a joke? Unfortunately, the answer is "no". It's true: Microsoft is limiting the number of programs in its next operating system for netbooks to 3 at once.
This is a true sign that even Microsoft knows Windows is bloated. And its solution is to limit the users who are running the software, rather than make the operating system efficient enough that it does not need these limitations. This has boosted the use of Linux on netbooks, which it is perfectly capable of running on without any such limitations. Another boost for Linux on netbooks is Windows' inability to run on the ARM architecture used in some netbooks, a low-voltage architecture designed to save power and extend battery life.
Windows is currently reported as "incompatible", though Microsoft has hinted at the possibility of Windows 7 becoming compatible. Meanwhile, as most would expect, Linux is fully compatible with the ARM architecture now. I also touch on a very similar issue in the next section, which deals with the amount of memory (RAM) that Windows and Linux can use. So, back to this netbook example: we can clearly see that Windows is becoming bloated, while Linux remains efficient in the resources it needs to run. This adds to the higher cost of running Windows compared to Linux, because more hardware is needed, which I will cover in more depth in a later section.
This also demonstrates how users of Windows are actually limited in functionality right out of the box, in order to compensate for the bloated nature of Windows. These reasons and more should give Linux a boost in the netbook market, especially once Windows XP is no longer available on netbooks. Right now, Windows XP does run efficiently on netbooks, which is the only reason Windows is on them at all.
It has already been proven that Windows Vista, and probably Windows 7, will either not run very well, or will have severe limitations in functionality. Luckily most desktops and laptops now come with enough resources to run the latest version of Windows. But, with that being said, since Windows consumes a higher amount of resources when compared to Linux, it leaves less room to grow in years to come when using the same hardware. My prediction is that Windows users will probably soon find their resources being maxed out on their computers, while Linux users will be able to use the same computers for many more years to come, without these issues.
Recently I came across another disturbing downfall of Windows. Let me size up the whole situation for you from the beginning. This one has to do with 32-bit and 64-bit architectures on the Intel platform. Since the 1980s, the Intel platform has been 32-bit. This platform has worked well all these years and is still used today. But a few years ago, Intel started releasing 64-bit architectures.
What are the benefits? Mainly speed and expandability. For instance, the 32-bit Intel architecture was originally designed to use up to 4 GB of RAM (memory), and that's it. So, you're sold on 64-bit already? Well, don't move forward quite that fast. Yes, 64-bit has its advantages, that's for sure, but it hasn't been around long enough, so it still has its quirks.
So, even today, 32-bit is the most stable platform as far as applications and software go, because they have been developed and stabilized over a longer period of time. Retaining 32-bit functionality is therefore quite important, at least until more software is written and tested on the 64-bit platform.
Luckily, Intel found a way to increase the maximum memory of the 32-bit architecture by introducing something called Physical Address Extension, or PAE. Essentially, this allows 32-bit operating systems to address up to 64 GB of memory while running on a native 32-bit architecture. This is a big deal. It allows users to continue using 32-bit computers with 32-bit operating systems and programs, yet take advantage of more memory and surpass the infamous 4 GB limit.
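The arithmetic behind those two numbers is worth spelling out (a quick sketch using shell arithmetic; the figures are architectural limits, not what any given chipset actually supports):

```shell
# A 32-bit address can reach 2^32 bytes:
echo "$(( (1 << 32) / (1024 * 1024 * 1024) )) GB"   # 4 GB

# PAE widens physical addresses to 36 bits, so 2^36 bytes:
echo "$(( (1 << 36) / (1024 * 1024 * 1024) )) GB"   # 64 GB
```

Note that PAE raises the machine's physical memory ceiling; each individual 32-bit process still sees at most a 4 GB virtual address space.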
If the motherboard in the computer supports it, up to 64 GB of memory can be used from a 32-bit operating system. This is where I will point out a serious flaw in Windows XP and Windows Vista that has been lingering for quite some time and is still present today in the latest version of Windows. Essentially, Microsoft, for whatever reason, has never gotten these versions of Windows to correctly use the PAE support that Intel designed to allow more memory to be used.
The worst part, from Windows' perspective, is that the developers of the Linux kernel successfully implemented PAE, and Linux can use up to 64 GB of memory without any hassle. Again, Microsoft has fallen short on implementing functionality in its Windows operating systems. For some reason, though, Microsoft has successfully gotten PAE working in some of its server operating systems. While this partially makes sense, since servers generally require quite a bit more memory than desktop computers, there are many cases where desktop computers should be able to use more than 4 GB of memory.
Many desktop models today come with 4, 6, 8 GB of memory or more. Take, for example, the need to run more robust applications like AutoCAD, Solidworks, virtual machines, and other intensive workloads. In these cases, using more than 4 GB is essential. If you want to use Windows, you are forced to use the 64-bit version of the operating system in order to use more than 4 GB on a desktop computer.
As I mentioned already, running the 64-bit version can be very painful. A 64-bit operating system can run 32-bit applications, but device drivers for Windows must be 64-bit; 32-bit drivers cannot be used with a 64-bit operating system. This causes a lot of problems: not only are 64-bit drivers usually less widely used, and therefore less proven, but they also require the vendor to write a 64-bit version in the first place.
Essentially, at this point you are at the mercy of the vendor to write the driver, unless Microsoft decides to write it itself, but there is no guarantee. With Linux, you do not run into this issue, since the drivers are almost always written by the developers who work on the Linux kernel. This means that many older devices have drivers available for the 64-bit Linux kernel, which greatly broadens backwards compatibility. That said, the 64-bit Linux kernel is not as widely used, so you can still run into similar issues where drivers and software may not be as reliable as their 32-bit versions.
However, from my experience, using the 64-bit Linux kernel has a high success rate. The chart below shows the memory limits for various popular operating systems today, mainly Windows, Mac OS X, and Linux. Since the Linux kernel doesn't have multiple editions, it simply and fully supports PAE, whether it is on a server or a workstation.
This is a high success story for Linux, and a confusing and sorry story for Windows. We've all heard of the many issues and problems with Microsoft's latest version of Windows, Windows Vista. I've already pointed out many, and will point out more as we go. The issues of Windows Vista actually go much deeper than the basic issue of Microsoft releasing what seems like beta software, almost as if it were unfinished but pushed out the door anyway.
Well, read some more on Vista and you will soon find a huge amount of information about conspiracies within Microsoft over how Vista was released too soon, before many loopholes had been closed. There are published reports of Microsoft executives themselves admitting that there were items within Windows Vista that knowingly would not work, but the product was shipped anyway. Some well-known and published facts about Vista are pretty interesting; the full text of one such internal message can be found on many websites. When you realize what Microsoft has knowingly done with Vista behind closed curtains, it should prove quite upsetting, especially for those who have spent their own money on Windows Vista.
The evidence above is proof that Microsoft knowingly released beta-quality software and charged a premium price for it. This is not right. I am sure that more similar cases could be found by simply looking around on the Internet some more. My whole point is that one minute Microsoft is hyping up its latest operating system and other software, urging its consumers to buy, buy, buy now. The next minute, it's admitting mistakes with this very same software, looking to the next release to fix them, and requiring its consumers to buy the next new version and pay for its mistakes!
Linux is released right on time, without deadlines to meet for marketing purposes, without hype, without rushing to push a product out the door. This is a winning situation for the users of the software, as quality is put before quantity. More and more companies and individuals are finally seeing the light. Even federal government agencies such as the U.S. Department of Transportation (DOT) and the National Institute of Standards and Technology (NIST) have openly published statements that they are disallowing Windows Vista completely from their networks, due to problems with the software and its high cost.
Users have even started a petition to demand that Microsoft extend the end-of-life date for Windows XP, because they feel that Vista is not an option. The evidence is out there that Vista is failing miserably, and the public is realizing this and suffering from it. The fact that all of these events are taking place tells me there is something drastically wrong here. Users should not have to revolt against their operating system's maker.
They should be using their computers for what they were meant for! This whole scenario has signs of a dictatorship which I pointed out previously in this article, where Microsoft is the dictator and controls its customers. Microsoft has had more than 20 years to learn how to release an operating system to the public's benefit, and still to this day cannot do so without causing turmoil. This has prompted many to look for other options, and has given open source software and Linux a fighting chance to start spreading through to more homes and businesses to fill this void.
OK, so I have probably covered just about every angle of Windows and Linux, and the advantages of Linux and open source. One of the most comical moves I discovered in more recent times is with the latest server operating system yet from Microsoft. Windows Server 2008 looks more like a Linux operating system than ever. If you find time to read up on Windows Server 2008, note the many "new" features that it boasts. A powerful command-line shell? Hm, it seems to me that Linux was built on a powerful command-line shell, no?
And how about "server roles with only the necessary components and subsystems without a graphical user interface" (taken directly from the Microsoft product information pages for Server 2008)? Is this Windows we are talking about, without a graphical user interface? Is Windows NOT supposed to be a graphical user interface anymore? Check out more products such as Exchange 2007, where Microsoft has actually taken functionality out of the graphical user interface and moved it into the command-line interface. It might be just me, but it looks as though Microsoft is realizing that an operating system based primarily on a simple design with an optional graphical interface isn't such a bad idea.
Funny that this is Linux defined to a T, and has been for many years since the earliest days of its life. This can account for the high success of Linux in the server world. It might be a little overdramatic, but I am seeing more and more of Microsoft picking up Linux ideas and adopting them, yet bashing Linux at every chance it gets to try to improve its own sales. If you read more articles on Windows vs.
Linux, you will notice plenty of articles hosted on Microsoft's own website. But count how many you see on other websites; I am guessing that number won't be too high. In more recent times, Microsoft started releasing products under the Common Public License, an open source style of license invented by IBM. In 2001, Microsoft founded its Shared Source program and started to allow the source code of a select few products to be distributed. Microsoft's own website states: "The principles behind Microsoft's Shared Source philosophy include empowering customers and developers to be more successful, improving feedback to continually improve software..." Hm, this sounds vaguely familiar, doesn't it?
So, even Microsoft which over the years has been completely closed source, has started to open its doors, although so far it's been only opened a crack. Most of Microsoft's products are still completely closed source. It will be interesting in the future to see Microsoft's stance on open source, especially with its own products.
I think it is definitely evident that Microsoft is feeling the pressure from the open source community, because it is starting to adopt some of the open source concepts that have been out there for years.

[Figure: The Microsoft Shared Source program at microsoft.com.]
[Figure: The Microsoft Update website displaying an error, probably because of a heavy load on servers that cannot handle the request.]
[Figure: An Internet joke that has circulated since Windows 95, depicting the everyday faults and quirks of Windows.]
We all remember the things in life that stand out the most, right? I recall once when I was checking out at a local supermarket, back when self-checkout machines were first becoming quite popular. The supermarket I was shopping at had recently installed several of them. One day as I was checking out, I saw a technician busy trying to troubleshoot one of the machines.
When I glanced over, I immediately noticed a familiar screen, the infamous blue screen of death, right on the touch screen of the machine. I chuckled to myself as I recognized that it was running Windows NT 4. I haven't seen this happen since, but I thought it was pretty amusing that the company that implemented those machines used Windows as its platform of choice.
About a year later, they did replace the machines with a completely different model. A new and more stable operating system was used, perhaps? I haven't seen any of those crash since, so I am not sure what operating system they use now. Unfortunately, I have first-hand experience of trying to use a newer version of Microsoft software on an older Microsoft Windows operating system, backing up the point just mentioned above. A year or so ago, I attempted to upgrade Microsoft Money to the latest version, since my copy had suddenly stopped working for online price updates.
No errors came up, so I just assumed it was because the software was a few years old and online updates were no longer supported. Am I saying that I cannot install the new version of Money on my older version of Windows? This is unbelievable, or is it? Well, that is still a mystery to me, which I will probably never find the answer to. One would think Windows is Windows, and a Windows application should install on pretty much any version of Windows. But the new Money flat out would not install on my version of Windows or anything earlier. Unfortunately, upgrading to Windows XP was not an option in this case.
So, it was decided to steer away from Microsoft Money, look at a 3rd-party product called Quicken, and buy yet another program, which thankfully still supported my version of Windows. Unfortunately for Microsoft, they lost one more customer, since I had been a user of Money for years. They finally gave me a good reason to look elsewhere.

A Little Politics
2A. Maintenance Headache of Windows
3A. Dependencies, Compatibility
3B. Licensing
3D. Reliability, Anomalies, Stability
3F. Troubleshooting
4. A Matter of Cost?
How About Standards?