I’ve never done any sort of home networking or self-hosting, but thanks to Jellyfin and Mastodon I’ve become interested in the idea. As I understand it, physical servers (“bare metal”, correct?) are PCs intended for storing data and hosting services instead of being used as a daily driver like my desktop. From my (admittedly) limited research, dedicated servers are a bit expensive. However, it seems that you can convert an old PC or even a laptop into a server (examples here and here). But should I do that, or are there dedicated servers at “affordable” price points? Since this is my first experience with self-hosting, which would be the better route to take?
It is a fantastic idea to start your home server project on some e-waste hardware, and use it until you know specifically what features you’re lacking that you would need better hardware for.
If you already have one, it’s a good place to start. However, power efficiency will be the biggest drawback. Power ain’t free, and in some places it is very expensive. I’d recommend picking up some cheap ThirdReality switches and using them to monitor power consumption in Home Assistant.
Heck yeah! Old desktops or laptops are how most of us got started.
Things to consider:
- Power: this will probably be on 24/7. That adds up.
- Speed: not just the CPU; RAM, disk access, and the network interface can all limit how much data you can move.
- Noise: fans can suck (pun intended). Laptops tend to run quieter.
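The power point above is easy to put numbers on. A minimal sketch, assuming an always-on box; the wattages and the $0.30/kWh electricity price are made-up figures, so plug in your own:

```python
# Rough yearly running-cost estimate for an always-on home server.
# The wattages and electricity price below are assumptions, not measurements.

def yearly_cost(avg_watts: float, price_per_kwh: float = 0.30) -> float:
    """Yearly electricity cost of a device drawing avg_watts 24/7."""
    kwh_per_year = avg_watts * 24 * 365 / 1000
    return kwh_per_year * price_per_kwh

print(f"Old desktop idling at 60 W: ${yearly_cost(60):.2f}/year")  # ~$157.68
print(f"Mini PC idling at 8 W:      ${yearly_cost(8):.2f}/year")   # ~$21.02
```

A cheap power-monitoring smart plug (like the ones mentioned elsewhere in this thread) will give you a real average wattage to feed into a calculation like this.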
I’m sort of looking to upgrade, and N100 or N150 mini PCs are looking good. Jellyfin can do transcoding, so that takes a little grunt. This box would work well for me. It’s not a storage solution, but it can run Docker and a handful of services.
While laptop batteries may not have aged well, especially if they’re left discharged, one other nice perk is that laptops effectively have an integrated UPS.
Some laptops (ThinkPads in particular) are capable of limiting the battery charge level via a Linux utility called tlp, so the battery doesn’t go pop when plugged in 24/7.
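For reference, in recent TLP versions the charge thresholds live in /etc/tlp.conf. The 40/80 split below is just a common choice, not a recommendation from this thread, and only hardware with firmware threshold support (ThinkPads, mainly) will honor it:

```
# /etc/tlp.conf -- battery charge thresholds (supported hardware only)
START_CHARGE_THRESH_BAT0=40   # start charging when below 40%
STOP_CHARGE_THRESH_BAT0=80    # stop charging at 80%
```

Settings take effect after running tlp start (or a reboot).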
I wanted to echo this by saying that my lab started as a 4-bay QNAP NAS and evolved into repurposed consumer hardware as my interests and needs changed. My current server is an OptiPlex that I bought for being small, quiet, and having lots of cores, and my NAS is just my old gaming PC build with an HBA card (for extra SATA ports) stuffed into a fancy case. A server is any computer that you say is a server (ideally one with functional network connectivity).
Adding on to the noise point: if you do end up in a situation where you’re considering refurbished enterprise hard disks, know that they are louder than normal consumer drives, especially if you have four of them running at once in a NAS.
I’ve been running a Plex server on an old desktop bought in 2016, mostly streaming movies and TV shows to my family. I have a 2 TB SSD and a spare 2 TB HDD. I was thinking about getting a mini PC to swap out the larger desktop. Could I get a large HDD and add it in an enclosure to the mini PC to handle the media volume?
I love the vibe in this thread/community. You all seem like real cool cats. I appreciate that.
It really depends on what you’re trying to do. At the end of the day, the foundational components are pretty standard across the board. All machines have a CPU, motherboard, storage mechanism, etc. Oftentimes those actual servers have a form factor better suited for rack mounting. They often have more powerful components.
But at the end of the day, the difference isn’t as striking as most people not aware of this stuff think.
I’d say considering this is your first experience, you should start with converting an old PC due to the lower price point, and then expand as needed. You’ll learn a lot and get a lot of experience from starting there.
A couple years ago my in-laws were downsizing after retiring and they asked if I would possibly have any use for their ancient desktop PC (at least old enough to have shipped with Windows 7).
I installed Debian on it and it’s running Jellyfin, qBittorrent through Gluetun, Calibre-Web, Nextcloud, and Pi-hole containers, with plenty of room to spare. I’ve also got some services running on Raspberry Pis (bought back when they were cheap), and an external 4 TB hard drive connected to it acting as a NAS. No hardware transcoding or 4K video on Jellyfin, but that’s no big deal for me.
All that to say yes, you can absolutely self-host on repurposed hardware. Any old PC you’re looking at is no doubt newer than mine.
Any normal computer can become a “server”; it’s all based on the software.
Most enterprise server hardware is expensive because it’s designed around demanding workloads where uptime and redundancy are important. For a goober wanting to start a Minecraft or Jellyfin server, any old PC will work.
For home labbers, office PCs are the best way to do it. I have two machines right now that are repurposed office machines. They usually work well because office machines generally focus on having a decent CPU and plenty of memory without wasting money on a high-end GPU, and they can be had used for very cheap (or even free if you make friends who work in IT). And unless you’re running a lot of game servers or want a 4K streaming box, even a mediocre PC from 2012 is powerful enough to do a lot of stuff.

Totally agree. I’ll add that I run Jellyfin, the *arrs, an admittedly low-throughput ripping/encoding setup, and a few other containers on a single OptiPlex Micro 7060, and there’s a lot of room left over. I very much appreciate the laptop processor in it because it usually sits idle for 16 hours a day.
Yeah, definitely. I started tinkering with my first server in 2020 and used an e-waste Dell tower with an i7-3770 (8 years old at that point) and an old RX 460 I had lying around. As others mentioned, power consumption was way worse than modern hardware. But at one point I had a half dozen people streaming 1080p Jellyfin content from it with no hiccups at all. That said, I was running Linux; not sure how it would do if you run Windows.
Right now I’m using a low-power PC to run my server, again an old e-waste Dell micro PC with a 5th- or 6th-gen i5 and no dedicated GPU. Still no problem streaming to my partner’s and my phones/tablets simultaneously. Again, running Linux.
I bought a used M920q for this reason. Still working on it; I’m at the docker-compose phase.
Those are beasts! My homelab has three of them in a Proxmox cluster. I love that for not a lot of extra money you can throw in a PCIe expansion kit, and the power consumption for all three is less than my second-hand Dell tower server.
Do you have any good resources I can look at to see if a cluster is something I should look into?
Not really, but I can give you my reasons for doing so. Know that you’ll need some shared storage (NFS, CIFS, etc) to take full advantage of the cluster.
- Zero downtime for patching. Taking systems offline to update Proxmox sucks, especially if the upgrade fails for some reason. A cluster means I can evacuate one host, upgrade it, and move on to the next with no downtime for the hosted VMs.
- Critical service resiliency. I have a couple of critical systems in my home lab that, if they unexpectedly go down, will make for a very bad day. For instance, my entire home network (and lab) is configured to use a PowerDNS cluster for DNS. I can put the master PowerDNS server on one host and the slave on a second host - if I have a hardware failure, I won’t lose DNS. I have a similar setup for my Kubernetes cluster’s worker nodes.
- Experimentation. A cluster gives me a larger shared pool of CPU/Memory than my single host could offer. This means I can spin up new VMs, LXC containers, etc and just play with new software and services. Heck that’s how I got started with my Kubernetes cluster - I had some spare capacity so I found a blog post that talked about Kubes on LXC containers and I spun it up.
I hope that helps give some reasons for doing a cluster, and apologies for not replying immediately. I’m happy to share more about my homelab/answer other questions about my setup.
Heck yeah. Not always the best for power efficiency though.
Old laptops are also a great choice, but I really recommend removing the battery first.
Why remove the battery? I was thinking one good thing about using a laptop is that, in a way, it has its own UPS.
Because as a headless server it’s likely to sit hidden away for a long time. That, plus always being plugged in, is not good for lithium-ion batteries. If/when it starts ballooning, will you notice? It’s a fire risk.
UPSes typically use lead-acid batteries, like a car.
I should have thought of that. Thanks! Ironically, I have a very old lead-acid UPS in the basement that I’ve been kind of afraid to plug in again after all this time.
You can typically replace the battery inside the UPS (and should every few years). You’re looking at $40-50 USD for “official” replacements, less for questionable third-party ones.
I just got a great Jellyfin+*arr setup running off of an old PC. Let me know if you need a hand
There’s no right way, really. You can turn almost anything into a server.
If you have old hardware laying around I suggest you start with that. When you’re comfortable with setting everything up and using it on your day to day, then it’s time to invest into hardware.
I think that’s preferable. I have reused my old gaming computer as a server since I stopped gaming for a while.
Yes, you can easily do it.
You want to look at 2 things: 1. Noise 2. Ratio of performance / power usage.
- Noise
When your PC runs 24/7, it might be annoying to hear its noise sometimes. Real server cases are usually much louder than normal PC cases because they are built for very strong airflow inside.
Think carefully about what you need. In my situation the server is just one light wooden door away from my bed, so I wanted it to be impossible to hear. I optimized for that, and it ended up so quiet that I cannot hear any fans, but I hear the clicking of the hard disks all the time. Well, I mostly got used to that. For my next home server I want to build my own case that completely blocks this noise.
- Ratio of performance / power usage
People frequently ask: what if I turn this old Pentium (or similar) into a server?
Well, these old CPUs have very low performance compared to new ones, but it might just be sufficient. The catch is that the old veterans burn 100 watts for the same performance a modern (low-performance) CPU delivers at only 5 watts, and it will do that 24/7. Think about your yearly costs. It often turns out that buying a new one easily saves you money.
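To make that 100 W vs 5 W comparison concrete, here’s a quick sketch of the payback math. The $150 mini PC price and $0.30/kWh rate are assumed figures, not from this thread:

```python
# How long until a more power-efficient replacement pays for itself?
# All prices and wattages below are illustrative assumptions.

def break_even_months(old_watts: float, new_watts: float,
                      new_price: float, price_per_kwh: float = 0.30) -> float:
    """Months of 24/7 operation until electricity savings cover the purchase."""
    saved_kwh_per_month = (old_watts - new_watts) * 24 * 30 / 1000
    saved_per_month = saved_kwh_per_month * price_per_kwh
    return new_price / saved_per_month

# 100 W veteran replaced by a 5 W modern low-power box costing $150:
months = break_even_months(100, 5, 150)
print(f"Pays for itself in about {months:.1f} months")  # ~7.3 months
```

At cheaper electricity rates the break-even point stretches out, which is why the answer genuinely depends on where you live.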
My answer would basically be yes, but. An old desktop (or even laptop) can definitely be used and will run fine. It should be very easy to get one for free or very cheap as companies will typically write them off after 3-5 years.
However, you might want to consider power consumption. Running a desktop 24/7 will use a lot more power than a new MiniPC or a NUC, so you may want to calculate how much it’ll cost to run a desktop 24/7 compared to a device that only uses 5W or whatever, and see whether the upfront savings make up for what you’ll pay in electricity over a certain period.
I think you might actually want to look into second-hand mini PCs, unless you absolutely need to fit a bunch of hard drives in a case (like you probably will for Jellyfin).
That depends. A lot of the power consumption comes from spinning media. Even very old desktop Intel chips have CPU throttling and consume very little while idle. Corporate desktops, even old ones, are usually quite economical.