
LincStation N1 All-SSD NAS by LincPlus, Unboxing & Review

Today we’re going to be taking a look at the LincStation N1, a 6-bay all-SSD NAS by LincPlus. It is marketed as being great for a home or small office: silent, powerful and easy to use. Let’s find out whether it lives up to these claims.

Here’s my video review of the LincStation N1, read on for my written review;

Where To Buy The LincStation N1

Unboxing and First Look

The LincStation N1 comes in a fairly minimalistic black branded box.

LincStation-N1-In-Box

Inside the box, the NAS is on the top in a white protective sleeve.

LincStation-N1-Top-View

Underneath that is the Software License key card and a user manual, and in the bottom compartment is a screwdriver along with screws for the drives, as well as a 60W power adaptor and mains cable.

Unlike most NAS products, this one doesn’t come with an Ethernet cable.

LincStation-N1-Unboxed

Taking a look around the LincStation N1, the enclosure is a two-part design: the top is matte black plastic with a LincPlus logo on it and the bottom is a single piece of aluminium with turned-up edges. The top and two sides don’t have any ports on them.

LincStation-N1-Side

At the front, on the right, we’ve got a power button that doubles as an LED power indicator.

Front-Panel-and-Power-Button

Underneath the power button is a flip-down cover, and behind that are a single USB type C port and two SATA drive bays.

Front-Flip-Down-SATA-Drive-Bays

There is also a row of seven drive and network activity LEDs above the USB C port.

Underneath the flip-down cover is an LED strip. It appears to be an RGB strip, as it cycles through other colours while booting up, but it then just pulses dark blue while the NAS is running.

LED-Light-Strip

At the back, we’ve got a 3.5mm audio port, which is a bit of a strange addition for a NAS. Alongside it is an HDMI 2.0 port, then two USB 3.2 Gen 2 ports, a 2.5G Ethernet port and a 12V power input.

Underneath the ports are some ventilation holes for the internal fan.

Rear-Ports

The bottom has two rubber strips that act as feet, along with two drive bay covers for the NVMe drives.

NVMe-Drive-Bay-Covers

Size-wise, being an all-SSD NAS, it is quite compact, measuring 21cm long, 15cm wide and 3.5cm high. LincPlus advertise the LincStation N1 as having the same footprint as an A5 sheet of paper.

Thickness-of-LincStation-N1

I quite like the look of this NAS; it’s something you wouldn’t mind having visible on a desk rather than hidden away in a rack or behind closed doors.

LincStation-N1-Front-View

Drive Bays and Specifications

The six drive bays are a little different to most other NASs I’ve seen, which either have 2.5” SATA bays or M.2 bays for NVMe drives. This NAS has a combination: two 2.5” SATA drive bays that can be swapped out at the front, and four M.2 bays for NVMe drives that are installed through covers underneath it. Across these six bays you can install up to 48TB of storage.

I guess technically you could use the SATA bays for mechanical drives if you wanted cheaper storage capacity, but that would rather defeat the purpose of an all-flash storage NAS.

The LincStation N1 is powered by an Intel N5105 CPU. This is a 4-core CPU running at 2.0GHz. While this means that it’s not as powerful as a NAS like the F8 SSD Plus that I recently reviewed, it costs less than half the price. It is currently available for $399 from their web store or on Amazon.

LincStation-N1-Install-Drives

It’s got 16GB of DDR4 RAM and 128GB of flash storage. The RAM is soldered to the motherboard, so it’s non-removable, but it’s the maximum that the CPU supports in any case.

The NAS also has Bluetooth 5.2 and WiFi 6, although we’ll talk about the limitations of these a bit later when we look at the software.

Installing SSDs Into The LincStation N1’s Drive Bays

Now let’s install some drives into the LincStation N1’s drive bays so that we can test its performance.

I’m populating the two 2.5” drive bays with Crucial BX500 drives and the four M.2 drive bays with Crucial P3 Plus drives.

These are just for testing; you should use NAS-grade drives if you’re going to be using them in a NAS long term.

The 2.5” drives are mounted into the trays using the included screws and the trays then slide into place like traditional NAS drive bays.

The M.2 bays are PCIe Gen 3 x 1, so a fairly slow interface by today’s standards. This is likely constrained by the CPU, but even a single lane can easily saturate the 2.5G network connection in any case. This limitation is something to keep in mind though, as you could save yourself some money by going for older and slower drives without affecting overall performance.

P3-Plus-NVMe-Drive-Being-Installed

The bays only support 2280-size drives and have tool-less clasps that hold each drive in place. I really like this feature; it makes installing the drives a breeze.

Four-Crucial-P3-Plus-Drives-Installed

The covers also act as heatsinks for the drives, so each has a large thermal pad on it. You’ll need to remember to remove the protective film from each pad before replacing the covers.

Thermal-Pad-On-NVMe-Drive-Bay-Door

Now we just need to plug in a network cable and power cable, then press the power button to boot it up.

LincStation-N1-Booted-Up

Setting Up Unraid For The First Time

Unlike a lot of other NAS manufacturers, LincPlus haven’t tried to develop their own software for the LincStation N1; instead, they ship it with an Unraid license included.

Included-Unraid-License

I think this is a good choice. A product can have the greatest hardware and be let down quite significantly by its software. We’ve seen this over and over with SBCs that try to compete with the Raspberry Pi. Unraid has been around for a while and is a reputable software package for NAS setups.

When you first boot it up, you’ll need to enter the included Unraid license key to activate it or you can run a trial of the software for 30 days before activating it.

Unraid-First-Boot-License-Setup

The Unraid installation is about as clean as it gets. LincPlus haven’t preconfigured anything or installed any supplementary applications. This is probably a pro for an experienced user but may be a con if you’re new to Unraid and are still finding your way around.

Unraid-Dashboard

You’ll then need to assign your disks, start your array and then create a network share.

Drive-Array-Allocation

Unraid also allows you to install plugins, configure and run Docker containers and create and run virtual machines, and it has an App Store with over 2,000 apps available to install to add functionality to your NAS.

There are some downsides and limitations to Unraid on the N1. The standard version of Unraid supplied with the N1 doesn’t support the available WiFi and Bluetooth features, so you can’t use either of these as part of the core NAS functionality. You can, however, pass them through to a virtual machine.

LincStation-N1-Running-LED-Strip

The more significant issue is that Unraid is focused on physical mechanical drives and hasn’t really been optimised to work with SSDs. There is a risk that some SSDs may perform on-device data management which would invalidate the parity data. Unraid’s recommendation is to use SSDs as cache drives and mechanical drives for the array, which is obviously not the purpose of the N1.

Testing The Performance Of The LincStation N1

On the product page, LincPlus claim transfer speeds of up to 800MB/s. This is likely referring to the maximum read speed that can be achieved from a single NVMe drive. While this is important in some respects, you’re not going to get anywhere near this speed over the 2.5G network connection. You’d be lucky to get real-world results of around 280MB/s.
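To put that figure in context, here’s a rough back-of-the-envelope calculation of the 2.5G ceiling. This is my own sketch; the ~10% protocol overhead is an assumption covering typical Ethernet, TCP and SMB framing.

# Rough usable-throughput ceiling for a 2.5G Ethernet link (assumed ~10% overhead).
link_gbps = 2.5                      # raw line rate in gigabits per second
raw_mb_s = link_gbps * 1000 / 8      # 312.5 MB/s before any protocol overhead
overhead = 0.10                      # assumed Ethernet + TCP + SMB overhead
usable_mb_s = raw_mb_s * (1 - overhead)
print(f"Raw: {raw_mb_s:.1f} MB/s, usable: ~{usable_mb_s:.0f} MB/s")
# Prints: Raw: 312.5 MB/s, usable: ~281 MB/s, in line with the ~280MB/s above.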

You could potentially improve the network speed by adding a second 2.5G Ethernet adaptor to one of the USB ports to take advantage of link aggregation, but this is just a limitation of the NAS to keep in mind. Most people aren’t running more than 2.5G in their homes in any case.

Testing transfer speeds with a small 256MB file, I got average writes of a little over 250MB/s and reads of a little under 270MB/s.

With a larger 1 GB file I got average writes of a little under 250MB/s and reads a little under 260MB/s.

And with a very large 64GB file I got average writes around 240MB/s and average reads a little over 250MB/s.

AJA-System-Test-256MB

So quite consistent results across the three file sizes. Reads come fairly close to saturating the 2.5G network connection and writes are just a little slower.

I then tried running a real-world file transfer test in Windows 11, transferring a large 70GB video file.

Writing to the NAS we get a very consistent write speed of a little over 280MB/s and reading from the NAS we get a similarly consistent but slightly slower average transfer speed of about 255MB/s.

Windows-11-Realworld-Transfer-Test

Noise and Power Consumption

In agreement with LincPlus’ claims, the LincStation N1 is almost silent. It obviously doesn’t have any mechanical drives in it, which eliminates drive noise. If there is nothing else running in the room, you can faintly hear a fan running when it is on but it is impressively quiet. I also couldn’t hear or measure any difference between the fan noise at idle and when reading or writing to the drives.

Fan-Noise-At-Idle

Under a full CPU load the fan is a bit more audible but it’s still pretty quiet. You can’t hear it from more than 2 meters away in a quiet room.

Fan-Noise-Under-Load

Power consumption on the N1 is also great. It uses just 11W at idle and this only goes up to around 17-18W when reading or writing to the drives.

Final Thoughts On The LincStation N1

Overall I think the LincStation N1 is quite a nice entry-level all-flash storage NAS package. It has a well-balanced set of features and although it is limited by the single 2.5G Ethernet connection, this is likely good enough for most home or small office use cases. I think their decision to include an Unraid license rather than developing their own software is a really good one and you’ve obviously still got the flexibility to go with a different OS if you’d like to.

Let me know what you think of the LincStation N1 in the comments section below.

LincStation-N1-Running-With-Lights

I Tried 3 New Hats For The Raspberry Pi 5

Today we’ve got three new hats for the Raspberry Pi 5 that we’ll be trying out. I’ve used variants of each of these on my Pi 4 setups for a few years now and I think they’re the handiest hats to keep around for my Pi projects.

3-New-Hats-For-The-Raspberry-Pi-5

Here’s my video trying out and testing these Pi 5 hats, read on for the write-up;

Where To Buy These Pi 5 Hats

Tool & Equipment Used:

X1003 NVMe Hat

The first hat we’re going to be looking at is the X1003 NVMe hat, which you can pick up online for around $15.

I tested three other NVMe hats a few months ago, but this was one that I was told about after doing those tests.

I primarily like that it is compact and simple. This leaves a lot of room around the hat for cooling. The main drawback with the others that I’ve tested is that the bottom-mount ones tend to enclose the drive and cause it to run hot, while the top-mount ones tend to restrict airflow to the cooler.

This hat keeps both the drive and fan open.

There is one strange design choice: it requires an active cooler to be installed on the Pi, because the hat is supported along its edge by one of the active cooler’s screw holes. I don’t know why they went with this approach rather than using one of the Pi’s mounting holes with a standoff; I assume they wanted to keep those free.

Hat-Is-Mounted-With-Single-Screw

In terms of drive options, you can only use 2230 and 2242 size drives in it. Like the other NVMe hats that I’ve tried, it connects to the Pi with an FPC cable and gets additional power from the GPIO pins.

I tested this hat using James Chambers’ Pi Benchmarks script and a Sabrent Rocket NVMe SSD.

I got an average score over 3 tests of 42,457. This is a good result given that I’ve left the Pi’s PCIe port running at the default Gen 2 speed. It is comparable to this drive running at Gen 2 speeds in the other hats that I’ve tested.

James-Chambers-Pi-Benchmark-Script-Score

The hat will also fit into the official Pi 5 case with the fan plate removed and the active cooler then takes care of keeping the Pi’s CPU cool.

X1003-Fits-Into-Official-Pi-5-Case

PoE+ & NVMe Hat by HackerGadgets

The next hat that I tested is an NVMe and PoE+ Hat by HackerGadgets which you can get online for $43.

This is a Power over Ethernet, or PoE, hat designed for the Pi 5 that isn’t much larger than the official PoE hat for the Pi 4, but it also includes an M.2 port for an NVMe drive that makes use of the Pi 5’s PCIe port.

M.2-Port-For-NVMe-Drive

This hat allows you to make use of a PoE switch or PoE injector to power your Pi through the connected network cable rather than having to use a separate power adaptor. It simplifies your setup and allows you to use fewer cables. If you’re running a headless arrangement like you would on a NAS or Pi-hole project, then you only need an Ethernet cable and your Pi is good to go.

It’s PoE+, or PoE Type 2, which means it can handle a higher power delivery than the original PoE standard: typically up to 25W instead of the original 12.5W.

Switch-Mode-Power-Circuitry

This hat’s switch mode power supply can supply 5V at up to 4.8A, so up to 24W, and it’s got an additional USB C power port on the back that you can use to power or charge an external device.

USB-Type-C-Port-On-Back

It’s also got a cutout in the PCB that you can either mount a 30mm fan to, or install the hat over the active cooler and it’ll still allow airflow through to the fan.

Hat-Still-Provides-Path-For-Airflow-To-Cooler

Like the previous hat, this one is also compatible with 2230 and 2242 size drives.

The hat plugs into the GPIO pins to supply power to the Pi and it’s got an FPC cable to connect the NVMe drive to the Pi’s PCIe port. It’s also got a 4-pin connector at the front that picks up power from the Ethernet port.

PCIe-Ribbon-Cable-To-Connect-Drive

With the hat installed, we don’t need to hook up a USB C power cable to the Pi as it’ll get power from the Ethernet connection.

USB-C-Power-Cable-No-Longer-Required

Upon booting up, I got a warning that the power supply is not capable of supplying 5A. This comes up because the Pi can’t carry out the usual power delivery negotiation with the supply when it’s being powered through the GPIO pins by the hat.

You can disable this check in the bootloader firmware by adding the line below to the boot.conf file (typically edited with sudo rpi-eeprom-config --edit):

PSU_MAX_CURRENT=5000

To test this hat, I ran two different tests, one on the NVMe drive and one on the CPU.

I again tested the NVMe drive using James Chambers’ Pi Benchmarks script and a Sabrent Rocket NVMe SSD. I got slightly better results than with the previous hat, with an average score over three tests of 43,550, again at Gen 2 speeds.

Running-Pi-Benchmark-Script-To-Test-Drive-Speed

I also tested it by running CPU burn to load up the CPU to make sure that the PoE power supply was capable of handling a full CPU load. It ran for 10 minutes without any issues and also without thermal throttling. PoE hats tend to run quite hot but this doesn’t seem to have a significant effect on the Pi.

This hat is also compatible with the official Pi 5 case.

PoE-and-NVMe-Hat-Fit-Into-Official-Pi-5-Case

X1200 UPS Hat

The last hat is very similar to one that I used in my Pi 4 server build a while back and that’s an X1200 UPS hat which you can get for $42.

This hat stores power in two 18650 cells on the bottom and provides uninterruptible power to the Pi 5 through the Pi’s GPIO and power circuit pins.

Also on the bottom are some additional 5V power supply connectors and a power button.

Additional-5V-Power-Connectors

On the top we’ve got a USB type C power input, which takes the Pi 5’s official 5V 5A power supply. We’ve also got a charging LED and a battery voltage indicator. There are two more status LEDs on the other edge: the Pi 5 one tells you when the Pi 5 is powered on and the one alongside it tells you when the hat is on.

Battery-Level-Indicator-and-Charging-LED

The hat uses some cleverly designed pogo pins to make contact with the Pi’s power and GPIO pins from underneath the Pi. With this hat on the bottom, you’ve still got space on top of the Pi to put an active cooler and a top mounted NVMe hat.

I’m going to add the X1003 NVMe hat to the setup as my boot drive.

Once installed, the Pi can boot up from the batteries if they’re charged and you then provide power to the UPS rather than to the Pi to keep them charged.

There is a script that you can install that’ll provide UPS stats to the Pi via the I2C interface. This will also tell the Pi to shut down safely if the battery voltage drops below a configurable level.
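I won’t reproduce the full script here, but to give an idea of how it works, below is a minimal Python sketch of reading the battery stats over I2C. It assumes the hat uses a MAX17040-style fuel gauge at address 0x36, which is what similar X-series UPS boards use; check your hat’s documentation before relying on it.

# Minimal sketch: read battery voltage and charge level from the UPS hat over I2C.
# Assumes a MAX17040-compatible fuel gauge at address 0x36, as used on similar
# X-series UPS boards. Check your hat's documentation before relying on this.
import struct
from smbus2 import SMBus

I2C_BUS = 1        # default I2C bus on the Pi
FUEL_GAUGE = 0x36  # assumed fuel gauge address

with SMBus(I2C_BUS) as bus:
    # VCELL register (0x02): 12-bit cell voltage, 1.25 mV per step
    raw = bus.read_word_data(FUEL_GAUGE, 0x02)
    raw = struct.unpack("<H", struct.pack(">H", raw))[0]  # fix byte order
    voltage = (raw >> 4) * 1.25 / 1000.0

    # SOC register (0x04): state of charge as a percentage
    raw = bus.read_word_data(FUEL_GAUGE, 0x04)
    raw = struct.unpack("<H", struct.pack(">H", raw))[0]
    charge = raw / 256.0

print(f"Battery: {voltage:.2f} V, {charge:.0f}% charged")
# A shutdown script would poll these values and trigger a safe shutdown once
# the voltage or charge level drops below a configured threshold.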

When mains power is removed, the UPS continues to provide power to the Pi, like in the beginning when it was running off the batteries only. The length of time that it can run will obviously depend on the capacity of the batteries and what you’ve got connected to your Pi but you should be able to get an hour or two from a good set of 3000mAh batteries.

Runs-With-Power-Disconnected
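As a rough sanity check on that runtime estimate, here’s the arithmetic with some assumed figures for cell voltage, converter efficiency and Pi power draw:

# Rough runtime estimate for two 18650 cells powering a Pi 5 (assumed figures).
cells = 2
capacity_mah = 3000        # per cell
nominal_v = 3.7            # typical Li-ion nominal voltage
efficiency = 0.85          # assumed boost converter efficiency
load_w = 7.0               # assumed Pi 5 plus NVMe drive at light load

energy_wh = cells * capacity_mah / 1000 * nominal_v * efficiency
runtime_h = energy_wh / load_w
print(f"~{energy_wh:.1f} Wh usable, ~{runtime_h:.1f} h runtime")
# Prints ~18.9 Wh usable, ~2.7 h runtime; heavier loads push this down towards
# an hour, which matches the estimate above.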

This hat is too big for the official case but there are a couple of other case options available for it, including one of my own designs.

This-Hat-Is-Too-Thick-For-The-Official-Pi-5-Case

Let me know in the comments section if you’ve used any of these hats for your own projects or if there are any other hats for the Pi 5 that you’d like me to try out.

Beelink EX Docking Station For The GTi 14 Ultra

A couple of weeks ago, I tested the Beelink GTi 14 Ultra, a powerful mini PC with a full-size PCIe port underneath it. At the time, Beelink said that this was for a dock that they were working on, which would allow an external GPU to be used with the PC, but it hadn’t launched yet. So I tested the interface using my own 3D-printed adaptor that screwed onto the bottom of the PC, along with a PCIe riser and an external power supply.

External GPU Plugged In

Beelink have now completed the dock, so that’s what I’m going to be sharing with you in today’s video.

This is the EX Docking Station and it is compatible with the GTi14 and GTi12 series PCs at the time of writing this post.

Beelink-EX-Docking-Station

Here’s my video of the EX Docking Station, read on for my write-up;

Where To Buy The Beelink EX Docking Station?

Tools & Equipment Used

Unboxing & First Look

The Beelink EX Docking Station comes in a white branded box with a sleeve, and it’s a lot bigger and heavier than I was expecting it to be.

EX-Docking-Station-Unboxing

Inside the box you get some manuals, the EX Docking Station, 8-pin power connector cables, a power supply cable, brackets to hold the PC and GPU in place, and then a pack of screws and a PCB that looks like an M.2 adaptor for a WiFi module.

Beelink-EX-Docking-Station-Unboxed

The EX Docking Station measures 179mm wide, 65mm high and 225mm long.

EX-Docking-Station-Size

This is a bit more than just a PCIe adaptor or riser; Beelink have integrated a few other features which make it versatile and really easy to use.

First up is the main feature: the PCIe x8 port, which allows you to connect an external GPU to the PC.

PCIe-Slot-For-GPU

Alongside that are two 8-pin power supply ports for the GPU. These are fed from an internal 600W power supply.

Two-8-Pin-Power-Connectors

There is also a separate port to power an external fan if your GPU requires this.

Next to those is a USB 2.0 port. I think this is a bit of a strange addition; it would have been a lot more useful if it were a USB 3.0 or USB type C port, but I guess you could use it for a keyboard or mouse dongle.

EX-Docking-Station-Overview

Then there are two antenna ports that you can use if you add an internal WiFi adaptor.

At the front is the power button and a power indicator LED alongside it.

EX-Docking-Station-Power-Button

At the back, we’ve just got the power supply input and some ventilation holes.

Power-Input-and-Ventilation-Holes

What GPUs Can You Plug Into The EX Docking Station?

The open design and relatively powerful integrated power supply mean that you can use a high-end GPU with the docking station.

EX-Docking-Station-Side

Beelink claim that you can run a GeForce RTX 4090 in it. I don’t have a 4090 to test this claim, but it seems like they have sized the power supply at 600W specifically to make sure that high-end cards like these can be used, since the 4090 requires around 450W. As shown earlier, the dock only has two 8-pin power ports available though, and some cards like the 4090 require four, so you may need to use additional adaptors to supply power to your card.
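As a rough illustration of why adaptors come into it, here’s the power budget using the standard PCIe ratings of 75W from the slot and 150W per 8-pin connector (the 450W figure is the 4090’s approximate board power):

# Rough power budget for the dock's two 8-pin connectors plus the PCIe slot,
# using standard PCIe ratings. The card figure is the RTX 4090's typical board power.
slot_w = 75                 # PCIe slot power
eight_pin_w = 150           # per 8-pin PCIe power connector
connectors = 2              # available on the dock
card_w = 450                # approximate RTX 4090 board power

available_w = slot_w + connectors * eight_pin_w
shortfall_w = max(0, card_w - available_w)
print(f"Available: {available_w} W, card needs ~{card_w} W, shortfall: {shortfall_w} W")
# Available: 375 W against ~450 W needed, hence the extra adaptors, even though
# the 600 W supply itself has plenty of headroom.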

EX-Docking-Station-Back

Another thing to keep in mind is that the dock is designed for 2-slot cards, so you’ll probably need to remove or not use the retaining bracket to fit a 3-slot card like the 4090, but it looks like there is enough room for one.

GPU-Support-Bracket

Internally, the EX Docking Station also has another PCIe 4.0 x 1 expansion port in the form of an M.2 port. You can use this port to add an NVMe SSD or use the small included adaptor board to add a WiFi module.

Installing A GPU On The Dock

Now that we’ve had a look at what the EX Docking Station has to offer, let’s get a GPU plugged into it and try it out. I’m going to be using the same Radeon RX 6600 GPU that I used with my 3D-printed adaptor.

Radeon-RX6600-GPU

First, we need to install the mounting bracket on the back of the dock.

Back-Bracket-Installed

The GPU plugs into the PCIe port and we just need to hook a single 8-pin power cable up to it.

We then need to remove the cover over the PCIe slot on the bottom of the GTi 14 and then slide it onto the PCIe adaptor on the dock.

A retaining plate then holds the PC in place.

PC-Retaining-Bracket

Testing The EX Docking Station

To test the EX Docking Station, I tried running two benchmarks that I ran previously on the stock GTi 14 – Geekbench GPU and Furmark.

In Geekbench we get a GPU score of 78,310, which is unsurprisingly significantly higher than the 37,460 we got on the stock setup.

Running Furmark, we get a score of 7,033, which is also much higher than the 1,920 we got with the stock setup.

To test gameplay, I opened up Counter-Strike 2 and set all of the graphics settings to Very High at 1080P.

We get around 150 fps quite consistently. This is about 2.5 times what we’d get on the integrated GPU, which is already quite powerful for a mini PC.

Playing-Counterstrike-2

In Doom Eternal with all graphics settings on Ultra Nightmare and Ray Tracing turned off at 1080P, we get over 200fps. This is about 3 times faster than the integrated GPU and the RX 6600 is a pretty low-tier budget GPU.

Playing-Doom-Eternal

Can You Use Other PCIe Cards With The Dock?

You don’t have to use this docking station to add a GPU to your PC. Since it uses a standard PCIe interface, you can use it with other PCIe cards too.


I actually used this dock to add a 10G Ethernet adaptor to the GTi 14 recently to do my testing on the Zimacube Pro and the TerraMaster F8 SSD Plus. This is obviously quite an expensive way to add 10G Ethernet to your PC, but it does give you a lot of expansion options for other cards too.

Final Thoughts On The Beelink EX Docking Station

The EX Docking Station retails for $159 on their website at the time of writing this post. You are paying a premium for what is largely a 600W power supply and a PCIe riser with some added features, but I don’t think it’s prohibitively expensive.

Being a first-gen product there are two minor drawbacks that I’d like to mention.

When you use the docking station with the GTi 14 Ultra, you still need to use a power cable for the PC as well. You’ve then got two power cables to plug in and two power buttons to press to boot the PC up. It would have been nice to have these a bit more integrated in some way.

Two-Power-Ports-On-Dock-and-PC

I also would have liked to have seen an easier way to add and remove the PC from the dock. Something like a toolless lever or snap-in lock to hold the PC in place, like a laptop dock, would make it far easier to remove the PC, which is kind of the point of a dock.

PC-Retaining-Bracket

Other than those two drawbacks, I think this is quite an innovative product. Mini PCs have always been held back by their GPU performance and with this dock, you can add a high-end GPU to an already powerful mini PC to create a good gaming setup that also offers the flexibility to be very portable when you need it to be.

Let me know what you think of the dock in the comments section below.

I Tried The New TerraMaster F8 SSD Plus, Is It Any Good?

A few months ago I built an all-SSD NAS using a Raspberry Pi 5 and a quad-SATA hat from Radxa. By adding a 2.5Gb Ethernet adapter, I managed to get pretty good transfer speeds out of it. I got about 260MB/s writing files to the NAS and 200MB/s reading files from the NAS.

Assembled Pi 5 NAS Case

Following that video, TerraMaster reached out about a new all-SSD NAS that they’re launching this week. This is the new F8 SSD Plus. It’s their first NAS that is designed to be used with SSDs only and internally it’s got space for 8 M.2 NVMe drives.

There are two versions of this NAS. I am testing the higher-end F8 SSD Plus, which has an i3 processor and 16GB of RAM and will retail for $800. They also have a lower-spec F8 SSD with an N95 processor and 8GB of RAM for $600.

Here’s my video review and testing of the F8 SSD Plus, read on for my write-up;

Where To Get The F8 SSD Plus?

Tool & Equipment Used

Some of the above parts are affiliate links. By purchasing products through the above links, you’ll be supporting this channel, at no additional cost to you.

Unboxing & First Look At The F8 SSD Plus

The F8 SSD Plus comes in a black branded box that already gives you the impression that this NAS is much smaller than a traditional NAS that takes physical drives.

Included in the box is the F8 SSD Plus along with a Cat6e Ethernet cable (although not a particularly long one), a power cable for the power supply, a screwdriver and screws for installing the NVMe drives, the power supply and a set of eight heatsinks for the drives. The heatsinks are quite a nice inclusion which I wasn’t expecting to see.

TerraMaster-F8-SSD-Plus-Unboxed

First up, this NAS is quite small. It’s not much bigger than a single physical 3.5” drive that a traditional NAS would use.

TerraMaster-F8-SSD-Plus-Size

It’s a minimalistic design, which I quite like. There is nothing on the front and the two sides bear the TerraMaster logo.

On the bottom, we’ve got two 50mm PWM fans which draw air in through the bottom and exhaust out of the top of the NAS.

Fans-At-Bottom

On the back, we’ve got the barrel jack power input, an HDMI 2.1 port, a 10Gb Ethernet port, two USB 3.2 ports and a USB type C port.

F8-SSD-Plus-Ports-On-Back

On the top, we’ve got some ventilation holes and the power button.

Ventilation-and-Power-Button

They say that this is a toolless design, and it certainly is to open up: there’s just a single thumb screw at the back. This allows the internal chassis to slide out to access the board and drives. I wasn’t sure whether the board slides out of the top or down and out from the bottom, but out of the top is the answer.

I think calling it a toolless design is a bit of a stretch, as you need to secure your storage drives with a screw and they include a screwdriver for that. There are some ways they could have made it truly toolless, but this is nitpicking; it’s not difficult to install the drives.

Internals-of-F8-SSD-Plus

Internally we’ve got a single 16GB stick of DDR5 RAM running at 4800MHz. You can upgrade this to 32GB.

Installed-RAM-Stick

The CPU is under a large black heatsink at the bottom, directly above the fans. The F8 SSD Plus has an 8-core Intel i3 N305 processor with a maximum frequency of 3.8GHz. This is a 2023 chip which has got 6MB of cache and a TDP of 15W.

CPU-Heatsink

Then we’ve got our 8 M.2 NVMe ports. These are all PCIe Gen 3 x 1 ports. This may sound disappointing at first glance, but each of these ports individually comes close to saturating the 10Gb network connection, and a pool of drives easily will, so there shouldn’t be any issues with this speed. Through these 8 ports, we can connect up to 64TB of storage.

M.2-NVMe-Drive-Ports

Installing NVMe Drives In The NAS

Now that we’ve taken a look at the internals, let’s get the drives installed.

I’m using four Crucial P3 Plus drives. These are just for testing; if you’re going to be using drives in a NAS long term then you should get NAS-grade drives that have better endurance.

Crucial-P3-Plus-Drives

First, we need to fit a heatsink to each drive. They’re held in place with an included band on each end. I don’t particularly like this solution as I’m not sure how long these bands will last, but I do like that they’re fitted directly to the drive and the heatsinks look like they’re good quality.

Drives-With-Heatsinks-Mounted

Installing the drives is easy; they plug into the M.2 port and a single screw holds each drive in place.

With all four drives installed, we can slide the board back into the enclosure, plug in our power and network cable and boot it up.

Plugging-In-Power-and-Ethernet

TerraMaster’s Latest Operating System TOS 6

By default, the F8 SSD Plus is set up to install and run TerraMaster’s latest operating system, TOS 6. It’s based on Linux, but they’ve given the web interface a Windows 11 look and feel.

TerraMaster-TOS-6

On the first boot, you’ll be guided through a setup process that will set up a drive pool and install TOS 6 onto the available drives. The drives are set up using their TRAID system. There is a bit to go through to understand how this works, but it’s essentially quite similar to RAID 5, providing a good balance of redundancy and storage capacity while also allowing the flexibility to use different capacity drives within the array.
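As a rough illustration of the capacity side, the sketch below compares classic RAID 5 with a simplified single-redundancy estimate for mixed drive sizes. This is my own approximation of how SHR/TRAID-style pools behave, not TerraMaster’s exact algorithm.

# Simplified usable-capacity estimate for a single-redundancy pool.
# Classic RAID 5 is limited by the smallest drive; a TRAID/SHR-style layout can
# roughly use the total minus the largest drive. This is an approximation, not
# TerraMaster's exact TRAID algorithm.
def raid5_usable(drives_tb):
    return (len(drives_tb) - 1) * min(drives_tb)

def flexible_estimate(drives_tb):
    return sum(drives_tb) - max(drives_tb)

mixed_pool = [1, 1, 2, 4]  # hypothetical mix of drive sizes in TB
print("RAID 5:", raid5_usable(mixed_pool), "TB usable")            # 3 TB
print("TRAID-style:", flexible_estimate(mixed_pool), "TB usable")  # 4 TB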

TOS has done away with a traditional dashboard-style layout with desktop icons and has instead added a taskbar along the top, with little tooltips that pop up to guide you around.

TOS-6-Desktop-and-Taskbar

It feels fairly intuitive to use. It’s even got a sidebar to monitor system stats, which can be modified by dragging and dropping modules, and it includes a notification bar.

TOS-6-Sidebar

I really like their drive management and backup options. You have a lot of options for local and cloud backup. You can also set up an email address to automatically send notifications to if errors with drives or processes are detected.

I also like their file management windows. Again, this interface has a Windows 11 look and feel, but that makes it intuitive. You’ve got right-click options for files and folders, and you can create shares directly from this interface.

File-Manager-Window

You can even preview some files like photos directly from the file management window.

File-Manager-Photo-Preview

You can quickly search for settings or features from the settings window, so you don’t have to waste time looking through menus.

Search-For-Settings

TerraMaster have also included a nice array of apps that you can install to add functionality to your NAS.

Applications-Available-For-Install

If you can’t find what you need with these apps, you can also quite easily install Docker to deploy your own containerised apps. The CPU in this NAS has a fair amount of headroom to run these, so the F8 SSD Plus will work well as a small home lab.

That’s a brief overview of TOS 6. If you don’t like their software you can also install your own operating system like TrueNAS or Unraid on the NAS if you’d prefer.

Testing The F8 SSD Plus’ File Transfer Speed

Next, let’s do some transfer speed tests. I first used AJA System Test to automatically test the transfer speeds and then I did a real-world test on Windows 11.

AJA System Test

Transferring a small 256MB file, I got fairly consistent writes a little over 1000MB/s and reads around 850MB/s.

Drive-Testing-256MB-File

Going up to a 1GB file, we get very similar results – writes a little over 1000MB/s and reads around 850MB/s.

Drive-Testing-1GB-File

Transferring a 64GB file started off much slower than the previous two tests. Writing started off at 500MB/s, but then ramped up during the first half of the transfer and eventually settled at a little under 1000MB/s for the remainder of the write. Reading the 64GB file was stable but was again slower than with the smaller files, remaining at a bit under 750MB/s. So both reads and writes were about 100MB/s slower with the large 64GB file.

Drive-Testing-64GB-File

Windows 11 File Transfer Test

Running a real-world transfer test in Windows 11, copying a large 70GB video file to the NAS, writes started off saturating the 10Gb Ethernet connection at 1.1GB/s. This dropped off quite quickly though and eventually settled at a little under 650MB/s for the remainder of the transfer.

File-Transfer-Test-To-NAS

Reading the same 70GB video file from the NAS was much faster. Reading stayed at 1.1GB/s for most of the transfer, with just a couple of short dips.

File-Transfer-Test-From-NAS

So, overall my testing showed pretty good file transfer performance. Like with my other NAS reviews, this is straight out of the box with the default setup; I haven’t done any tweaking or optimising of settings.

Fan Noise On The F8 SSD Plus

The F8 SSD Plus is very quiet as it doesn’t emit any physical drive noise. Fan noise is also minimal, you can hardly hear the fans running when at idle. The ambient sound level in my testing room is about 32 decibels, so the fans running at low speed barely register.

Fan-Noise-Level-at-Idle

When writing to the drives or doing CPU-intensive tasks, you can hear the fans spin up but they’re not much louder. The sound level goes up to about 39-40 decibels.

Fan-Noise-Level-under-Load

So this NAS is ideal for a home or small office where you’d have the NAS in the same room that you’re working in.

F8 SSD Plus Power Consumption

Keep in mind that power consumption will obviously vary with the type and quantity of drives installed, so you may get slightly different results. With my 4-drive setup and with the CPU under no load, we get a power consumption of 14W.

Power-Consumption-At-Idle

This goes up to 35W when writing to all four drives and saturating the network connection.

Power-Consumption-Under-Load

TOS 6 does have an option to put the drives to sleep if there is no activity for a period of time and this should further decrease the idle power consumption. In their documentation, they say that this will bring power consumption down to 9W. This seems reasonable from my test results.

Limitations of the F8 SSD Plus

Most NAS products at this sort of price point come with a secondary network connection, so it would have been nice to see a 2.5Gb network port alongside the 10Gb port as a secondary or failover connection. You could add an external network adaptor to one of the available USB ports like I did with my Pi NAS, so this is not a major issue.

Each M.2 port also only supports PCIe Gen 3 x 1 drive speeds. This sounds slow, but the limitation has to do with the available PCIe lanes on the processor. The i3 N305 has only got 9 PCIe lanes available, and TerraMaster have distributed these over the 8 drives and the 10G Ethernet port, providing one lane each. I think they’ve made a fair choice here, as each drive comes close to the 10G port’s throughput on its own and the pool as a whole will saturate it, so you’re not really losing drive speed. This is worth keeping in mind when choosing drives though, as you can save some money by buying older and/or slower drives that work well with the available interface.

Drives-Available-on-F8-SSD-Plus
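To put some numbers on that, here’s a quick comparison of a single PCIe 3.0 lane against the 10Gb link using the standard line rate and 128b/130b encoding; real-world throughput will be a little lower on both sides.

# Compare one PCIe 3.0 lane against a 10Gb Ethernet link (theoretical figures).
pcie3_gtps = 8.0                       # PCIe 3.0 transfer rate per lane in GT/s
encoding = 128 / 130                   # 128b/130b line encoding
lane_gb_s = pcie3_gtps * encoding / 8  # roughly 0.98 GB/s per lane

tengbe_gb_s = 10 / 8                   # 1.25 GB/s raw line rate

print(f"PCIe 3.0 x1: ~{lane_gb_s:.2f} GB/s, 10GbE: {tengbe_gb_s:.2f} GB/s")
# One drive nearly matches the link on its own, and a pool of several drives
# has no trouble saturating it.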

The only other limitations I could find are that the F8 SSD Plus doesn’t have ECC memory and is missing native support for ZFS. Neither of these are particularly big concerns, they’re just worth noting.

Final Thoughts On The F8 SSD Plus

Overall I think this is a great product and I haven’t found any significant issues with it. There are a few features that would have been nice to have had included since it is quite pricey, but once you add an NVMe adaptor and 10Gb Ethernet adaptor to a mini PC of a similar size and performance, you’ll probably be around this price point too.

Let me know in the comments section below if you think I should have a go at building my own single board computer based NAS with similar functionality to see how it compares. Also, let me know what you think of the F8 SSD Plus.

I Tried The ZimaCube Pro, Is This The Perfect Home Server?

Today we’re going to be taking a look at the ZimaCube Pro. This is a new device from IceWhale, the company that have already brought us the Zimaboard and Zimablade that I’ve reviewed previously.

The Zimacube Pro is the second device from their Kickstarter crowdfunding campaign that was successfully funded in December last year. It is marketed as being a personal cloud server with easy-to-use software and has more powerful hardware than the standard Zimacube from the same campaign.

Here’s my video review of the ZimaCube Pro, read on for the written review;

Where To Buy The ZimaCube Pro

Tool & Equipment Used

Unboxing and First Look At The ZimaCube Pro

The ZimaCube Pro has a 12th generation Intel i5 processor instead of the Intel N100 processor in the standard cube. This allows for faster RAM, better PCIe expansion, faster M.2 ports and significantly better connectivity. It comes at a price though: the Pro version currently retails for $1099.00, which is a significant step up of $450 from the standard version that retails for $649.00.

Included in the box are some basic tools, a Cat6a network cable, a Thunderbolt 4 cable, a power adaptor and some screws for mounting the drives into the drive bays.

Ports and Interfaces on the ZimaCube Pro

Taking a look around the ZimaCube Pro, on the front we’ve got two USB 3.0 ports, a USB type C port, a 3.5mm audio jack and the power button.

ZimaCube-Pro-Front-IO

Below that, under the ventilation grille, it’s got six SATA drive bays which can take 2.5” or 3.5” drives, and a 7th tray that has four M.2 ports for NVMe drives.

The 7th tray also has customisable and programmable RGB lighting with an onboard controller that you can load custom firmware onto.

The ZimaCube Pro also has two internal M.2 ports – one of these being for the OS storage drive. So that’s a total of 6 SATA ports and 6 M.2 ports, which allow up to 164TB of connected storage.

This ventilation screen that covers the bays looks great when it’s installed but could really do with a small tab or recess on the edges to make it easier to remove.

Front-Ventilation-Panel

The sides each have ventilation holes at the top and four screws holding the top and bottom covers in place.

Side-Ventilation-Holes

On the back of the ZimaCube Pro, we’ve got a reset button, the power input, two Thunderbolt 4 ports, a 10Gb Ethernet port, two 2.5Gb Ethernet ports, two more USB 3.0 ports, a DisplayPort 1.4 and an HDMI port.

ZimaCube-Pro-Rear-IO

The Thunderbolt 4 ports allow the Zimacube to be used as a DAS (Direct Attached Storage) device. This is a feature that is not commonly found on consumer-level products, so I’m interested to see how well this works. They claim that you can get up to 2GB/s transfer speeds through this port, so we’ll definitely be testing that out!

Internally, as mentioned earlier, the ZimaCube Pro is powered by a 12th gen Intel Core i5 processor, the i5-1235U, with 10 cores running at up to 4.4GHz. This particular one has 16GB of DDR5 RAM, but this can be expanded up to 64GB.

We’ve got dual PCIe slots, one being PCIe 4.0 x 4 and the other PCIe 3.0 x 2. These allow you to add expansion cards like a GPU, an AI acceleration card or a transcoding card to improve the Zimacube’s performance for your particular workflow.

Two-PCIe-Slots

Our 256GB M.2 NVMe OS drive is partially hidden by the cooler.

Internal-NVMe-OS-Drive

The M.2 port near the back is populated with the 10Gb Ethernet adaptor, which has a short tail to a daughterboard with the physical port on it.

10GbE-M.2-Adaptor

One of the main complaints early users have had is with the ZimaCube Pro’s cooling solution. It is claimed to be quite loud and not very effective. IceWhale have responded to these concerns by providing an improved cooling solution to backers free of charge, which is supposedly quite easy to swap out. We’ll test this against the currently installed cooler to compare the results.

Replacement-CPU-Cooler

ZimaOS – The Intended Operating System

As with other IceWhale products, it is intended to be used with their own operating system. In this case, it comes preloaded with ZimaOS. This is very similar to CasaOS which is loaded onto the ZimaBoard and ZimaBlade, with a few features tailored to the ZimaCube like RAID support and remote access functionality.

You’re not locked into using their software though. You can fairly easily install other operating systems like OpenWRT, pfSense, TrueNAS or Unraid.

ZimaOS is effectively a skin for Docker with a bit of additional functionality and a good support community. It’s got a range of preconfigured apps that are very easy to set up, and you can configure your own apps through the web dashboard or by loading your own .yaml files.

ZimaOS-App-Store

Testing The Cooling Solution

Let’s start by testing the old and new CPU cooling solutions. I haven’t done anything to the stock cooler; I’ve left it as it arrived. Some users reported having their CPU cooler installed with the plastic protector left in place on the contact surface. To test the thermals, I’m going to first try a 50% load on the CPU and see what that does to the CPU temperature, and then I’ll try a 100% load.

ZimaCube-Pro

Old Cooling Solution

The temperature is already sitting quite high at idle. We’re at 2% CPU utilisation and are already running at over 40°C in a 20°C room.

CPU-Stress-and-Temperature-Test

At 50% load, the temperature spikes to over 90°C in about 2 seconds, which is really quite poor. The CPU starts thermal throttling almost instantly, dropping the clock frequency down to 3.5GHz and then further to 3.3GHz about 45 seconds later. This brings the temperature down a bit but obviously comes at the expense of performance.

At 100% load, it’s practically useless for sustained use. The CPU temperature spikes to 100°C before the utilisation gauge can even increase and again is significantly thermally throttling, this time down to just 2.4GHz.

So this cooler is really undersized or just doesn’t work effectively, confirming other users’ concerns.

Thermal-Photo-of-Original-Cooler

I have to also mention that the fans for the drive bays of the ZimaCube are quite loud. I haven’t installed any drives into the bays, and I’m in an air-conditioned room, and the drive bay fans already spin up occasionally. So their fan curves must be set to spin up at quite low temperatures. That’s not necessarily a bad thing, and you can adjust them in the BIOS. It’s just not great if you’re going to have the ZimaCube Pro near where you’re working.

Fans-Cooling-Drive-Bays

The CPU cooler is audible but isn’t that loud when idle. When it spins up under any sort of load, it is quite loud.

New Cooling Solution

Now let’s get the new cooler fitted. The cooler comes with a bracket for the underside of the motherboard but I believe it uses the same pattern and screws so we shouldn’t need to use this.

The old cooler is quite easy to remove. You just need to remove the four screws holding it in place and unplug the fan.

Original-CPU-Cooler-Removed

The new cooler has a preapplied phase change thermal pad on it, so we don’t need to apply thermal paste. I cleaned the old thermal paste off of the heat spreader to give it the best chance of success.

New-CPU-Cooler-Phase-Change-Thermal-Pad

They say that it should be installed with the fan at the back of the Zimacube, directed towards the front, which I’ve done.

New-CPU-Cooler-Installed

Now let’s boot it up and see if it performs any better.

At idle after booting up, we’re now running about 8-9°C cooler at 33-34°C.

Again starting with a 50% load, it now takes about 10 seconds to climb over 90°C. This is a significant improvement over the previous cooler, but it still starts thermal throttling under sustained load.

With a 100% load, we again hit 100°C and start thermal throttling almost instantly. It sustains a slightly higher CPU frequency for a short while longer, but not significantly so.

In terms of fan noise, the new cooler runs a little louder at idle but is a little quieter under full load, so not a significant difference.

The new cooling solution makes some improvements to the thermals but if you plan on doing CPU intensive tasks for long periods of time then you’ll probably still want to upgrade the cooling solution.

Thermal-Photo-of-New-Cooler

Given how quickly the temperature spikes under full load, I think some of the limitation might actually be with the heat spreader as it’s a painted surface. There may also be issues with the way the heat spreader is interfaced to the CPU.

Heat-Spreader-Painted-Surface

Testing NAS Transfer Speeds

Now let’s get some drives installed into the ZimaCube Pro and do some transfer speed tests on it.

I’m going to install four 2.5” Crucial BX500 SSDs in the main bays and four Crucial P3 Plus NVMe drives in the M.2 bay.

These aren’t ideal for a NAS, but they’re what I have available for testing. If you’re going to be using drives in a NAS long term then make sure that you use NAS-grade drives, preferably ones with a DRAM cache.

Our drives now show up in our dashboard and we can set them up as individual drives or in a RAID configuration.

ZimaOS-Drives-Recognised

I’m going to go with RAID 5 and I’ll set up the NVMe drives in one pool and the SATA drives in a second pool.

ZimaOS-Raid-Configuration

I had to reformat the SATA drives as I used them for my Pi NAS and they were already in a RAID configuration that the ZimaCube Pro didn’t like.

NVMe Storage Volume Test

Using AJA System Test and starting with a small 256MB file on the NVMe storage volume, we get writes of around 700MB/s and reads of around 900MB/s. So writes fall a little short of saturating the 10Gb Ethernet connection, but reads come very close. I haven’t done any optimisation or tweaked any settings on the NAS, so this is straight out of the box with very little setup. With a 1GB test file we get similar results.

Speed-Testing-NVMe-Storage-Volume

Going up to a large 64GB test file, we get similar write speeds but our read speed drops off quite a bit, down to 750MB/s.

I’ll have to do some digging to figure out why, but it seems like it likely has to do with the ZimaCube Pro’s available RAM. The ZimaCube Pro is probably not able to write to the NVMe drives fast enough to keep up with the network adaptor, so the RAM fills up and then the drive’s write speed becomes the bottleneck.

RAM-Filled-Up-During-NVMe-Storage-Volume-Test

NVME Volume Results Summary:

  • 256MB: 700MB/s Writes, 900MB/s Reads
  • 1GB: 700MB/s Writes, 900MB/s Reads
  • 64GB: 700MB/s Writes, 750MB/s Reads

SATA Storage Volume Test

Switching over to the SATA storage volume.

For the 256MB and 1GB test files we get very similar results to the NVMe drives.

Speed-Testing-SATA-Storage-Volume

When we go up to the 64GB test file, writes start off well but then drop quite considerably. We get about a third of what we got with the NVMe volume. Again this seems to be related to RAM filling up because the drives can’t keep up with the write speed.

Reading the 64GB test file from the SATA volume we get similar results to the smaller files so there are no issues here.

SATA Volume Results Summary:

  • 256MB: 700MB/s Writes, 900MB/s Reads
  • 1GB: 700MB/s Writes, 900MB/s Reads
  • 64GB: 250MB/s Writes, 750MB/s Reads

Real-world Windows 11 File Transfer Test

Running some real world file transfer tests in Windows 11, we get good results.

Realworld-Windows-File-Transfer-Test

Copying a 60GB video file to the NVMe storage volume, we get a very stable sustained 1.1GB/s.

Copying the same file from the NVMe storage volume, we again get a fairly stable 1.1GB/s. There were two occasions where the transfer speed dipped significantly, but it picked up again fairly quickly.

Thunderbolt 4 Transfer Test

When directly connected to my Mac over Thunderbolt, I tried the same three file size transfer tests.

Thunderbolt-4-Transfer-Speed-Test
  • The 256MB test file averaged around 1200MB/s writes and 1500MB/s reads but results were quite erratic.
  • The 1GB test achieved very similar results to the 256MB test file.
  • The 64GB file started off writing quite slowly but ramped up to 1000MB/s write. It then achieved faster reads than the previous file sizes, getting up to 1700MB/s – fairly close to the claimed 2000MB/s.

Other ZimaCube Pro Features

The ZimaCube Pro can also run virtual machines, so you can run multiple operating systems to support different applications and utilities.

Windows-10-Running-Virtual-Machine

Plex running on the ZimaCube Pro handles 4K video playback really well. This obviously depends on how much transcoding is taking place, which is minimal for these sample videos, but they barely register on the CPU.

Plex-Running-On-ZimaCube-Pro

Some early users have already developed their own software for the RGB lighting in the 7th tray. It’s fully programmable through the onboard ESP32 module and each of the LEDs are individually addressable, which gives you a lot of options.

Power Consumption Tests

In terms of power consumption, the ZimaCube Pro is rated for up to 220W.

I did two tests as the total power draw will likely be quite dependent on the type and number of drives you’ve got running.

The first test was without any storage drives installed. With this setup I got a power draw of 27W when idle and it peaked to 81W when under full CPU load.

With four SSDs and four NVMe drives installed, I got an average idle power draw of 34W. This isn’t much of an increase from the 27W without drives, but SSDs are obviously a lot more power-efficient than physical disk drives.

Power-Consumption-Idle-4-SATA-and-4-NVMe-Drives

With the 3.5” bays all populated with physical 3.5” drives, other users have measured around 50-60W with low CPU utilisation and 70-80W with high CPU utilisation. So still well under the rated consumption, but it’s going to cost a bit to have this running in your home 24/7. Here in Australia I’d be looking at about $8-10 a month to keep the ZimaCube Pro running with my current configuration.
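To show where that $8-10 figure comes from, here’s the arithmetic with an assumed average draw of around 40W and an assumed Australian residential tariff of roughly $0.32 per kWh; your own rate and duty cycle will shift the result.

# Rough monthly running cost for the ZimaCube Pro (assumed tariff and duty cycle).
avg_watts = 40               # assume mostly idle at ~34 W with occasional load
hours_per_month = 24 * 30
tariff_aud_per_kwh = 0.32    # assumed Australian residential rate

energy_kwh = avg_watts * hours_per_month / 1000
cost_aud = energy_kwh * tariff_aud_per_kwh
print(f"~{energy_kwh:.0f} kWh/month, ~${cost_aud:.2f} AUD/month")
# Prints ~29 kWh/month, ~$9.22 AUD/month, within the $8-10 range quoted above.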

Final Thoughts On The ZimaCube Pro

Overall I think the ZimaCube Pro is a great way to get started with running your own personal server in your home or small office. ZimaOS as it currently stands is a bit limited, but they’re constantly adding new features to it, and its simplicity makes it really easy to get a basic setup running, especially if you’re new to running a NAS or home server.

If you outgrow ZimaOS you can also easily transition to a more powerful NAS operating system like TrueNAS or Unraid.

ZimaCube-Pro-Unboxed

There are a couple of things that I think they need to work on.

While the new CPU cooler is better than the original, there is still a lot of room for improvement. As I said earlier, this may not even just be the cooler; it’s likely the CPU’s heat spreader as well. The drive bay fans are also quite noisy, and their fan curves are set to spin up at temperatures that are a bit too low.

The removable front panel is great for aesthetics but could also do with some tabs to make it easier to remove. You have to sort of hook your fingernails in under the vents to pull it off. It’s not difficult to do but just feels clumsy.

Front-Cover-Removal-Issues-ZimaCube-Pro

Let me know what you think of the ZimaCube Pro in the comments section below and let me know if there is anything else you’d like to see me test or run on it.

Personal Cloud Server Using A Pi 5 – Made With The Omtech Polar

The cost of cloud services might not be that significant for one month, but the recurring costs quickly stack up. In a couple of years you can easily be out of pocket over a thousand dollars. So today we’re going to be building our own personal cloud server to bring these services in-house for a single upfront cost and take back full control of our own data.

To do that we’re going to be using a Raspberry Pi 5 with an active cooler on it and a connected NVMe storage drive. To protect the Pi and drive, we’re also going to design and build a custom enclosure for it and I’ll be doing that using the Omtech Polar laser cutter and engraver.

Raspberry Pi 5 With Active Cooler

Omtech sent me this new laser to try out and share with you, so I thought the best way to do that would be through a project that showcases its cutting and engraving capabilities on a range of materials.

Omtech Polar Laser Cutter & Engraver

Here’s my video of the build, read on for my written guide;

Components Required To Build Your Own Personal Cloud Server

Equipment Used

Omtech Polar Unboxing and Setup

To start, let’s get the Omtech Polar unboxed and set up.

Omtech Polar Unboxing

The Polar comes in a wooden crate and includes everything you need to get it set up and running. It even includes a rotary axis for engraving cylindrical objects and a materials pack to help you get started with some basic projects.

It is a fully enclosed design that contains the smoke and fumes while cutting and engraving, so it includes a ducted ventilation system to draw the smoke out of the machine and exhaust it outside. The full enclosure is also much safer for the operator than more common open gantry-style lasers.

The Omtech Polar doesn’t require much in terms of setup. It comes pre-assembled, so all you need to do is connect the ventilation system to the back and connect it up to your computer via a USB cable or through your home network via Ethernet or WiFi.

Secure Ventilation System To Back Of Polar

The build quality is also quite good. It’s an all-metal enclosure with a thick glass top and lid. Both axes run on linear rails and all of the cabling and air tubes run in a drag chain.

It also comes with some great features like integrated water cooling for the laser tube, built-in air assist and a 5MP camera to assist with positioning your artwork.

The working bed area is 500mm x 300mm, but it can accommodate larger materials using its pass-through tray.

Work Bed Area 500 x 300mm

Now that the Omtech Polar is set up, let’s get the case designed so that we’ve got something to cut and engrave on it.

Designing The Cloud Server Case

I used Fusion 360 to draw up this case using the sheet metal designer.

Case Design

I chose this design feature so that we can open up the main body of the case into a flat pattern to cut out from a single piece of acrylic.

Flat Pattern Acrylic Body

This sheet is then bent to form a rectangular tube and a front and back panel finish it off. I’ve also added a clear panel to one side and I’ll make the small side panel insert and the front panel out of walnut plywood as some accent pieces.

Making The Cloud Server Case

To cut out the components, I’m going to be using Lightburn to control the laser. Omtech include a copy of RDWorks with the Polar on a USB drive as a free option, but I already use Lightburn quite a lot on my other lasers, so I’m going to use it for this project too. It’s great that this is an option on the Polar as some systems like the Glowforge lock you into using their own proprietary cloud-based software with no alternatives.

Raspberry Pi 5 Cutting and Engraving Template

Before running the laser, remember to always use proper certified protective eyewear suitable for your laser type when working with these machines, even fully enclosed ones.

Wear Proper Eye Protection

As a quick test to start with, let’s engrave and cut a dummy Raspberry Pi that I’ll use to test fit the case’s bends.

Raspberry Pi Template

The Omtech Polar has a somewhat automated focusing system. You focus the laser using the distance parameter in Lightburn which tells the laser how high to position the head above the material. When set to 17mm, the laser focuses on the surface of the work bed. You then subtract the material thickness from this to get the focus height. I’m using 3mm plywood for the test piece, so the focus height needs to be set to 14mm.

Omtech Polar Focusing
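That offset is easy to get wrong, so here’s the calculation spelled out as a tiny helper; the 17mm bed reference is the figure from my setup above.

# Focus height for the Omtech Polar: 17 mm focuses on the bed surface, so
# subtract the material thickness to focus on the top of the material.
BED_REFERENCE_MM = 17

def focus_height(material_thickness_mm):
    return BED_REFERENCE_MM - material_thickness_mm

print(focus_height(3))  # 3 mm plywood gives 14 mm, as used for the test piece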

The quality of the cuts and engraves is pretty good, although it doesn’t seem like there is a way to turn the air assist off through software or hardware. This means that the engraving smoke is blown down onto the surface of the wood, which leaves the smoke marks you can see around the engravings.

I’ll use masking tape on the engraved wooden side panel to stop this when we engrave the Pi logo.

The main enclosure flat pattern and back panel are cut from 2mm black acrylic.

Black Acrylic Components Cut Out

Then let’s cut the window from some 2mm clear acrylic.

Clear Acrylic Components To Be Cut

Lastly, we can cut the front and accent panel from a piece of walnut plywood. I’ll engrave the Raspberry Pi logo onto the side panel before cutting the panels out. I’ve put masking tape over the area where the Pi logo is going to be engraved so that the engraving smoke doesn’t mark it.

Walnut Plywood Components Being Cut

Now we’ve got all of the case components made up.

Case Components Cut Out

To bend the acrylic, I’m using an acrylic bending tool. I’ve added little notches to the flat pattern as guides for the bend lines.

I know it looks like a hair straightener, but you can get these bending tools from AliExpress for about $15 (or from Amazon for a little more) and they work well for bending small sections of acrylic.

Acrylic Bending Tool

With the four sides bent, it’s starting to look like a Pi case.

Case Body Complete

I’m not going to pretend that I got this right on my first go; in fact, far from it. I cut this flat pattern out about 8 times before I got the bends in the right places without breaking one of the thin sides alongside the clear window.

Failed Case Bends

Next, I’m going to give the walnut panels a quick coat of varnish to bring out their colour and seal them before gluing them into place.

Then to finish the case off, we can glue the front section and side panels into place with some superglue. I’ve mounted the dummy Pi into the case while doing the front panel to help with alignment.

Dummy Pi and Super Glue

I’ve also 3D printed some small 90-degree pieces to help with supporting the front and back panels.

I’ve also glued the feet onto the bottom, using two of the circular cutouts to make up each complete foot.

Feet Being Glued Into Place

The back panel is held in place with some brass inserts in the 3D-printed parts and M2.5 button head screws to secure it. This panel has to be removable to get the Pi in and out of the case.

Back Panel Removable With M2.5 Screws

I’ve also added a 40mm fan to the back panel to push air into the case and out of the vents at the front. This is held in place with some M3 button head screws and M3 nuts.

Pi Fan Installed On Back Panel

And that’s the case complete. Now we just need to install our Pi stack and load our software onto it to turn it into our personal cloud server.

Installing The Pi 5 and Software

I’m using a Pi 5 with a Pimoroni NVMe base underneath it.

Raspberry Pi 5 and NVMe Stack

I’ve got a 1TB Lexar NVMe drive plugged into the base. You can use a larger or smaller drive, or even the Duo version of the base for two drives if you’d like additional storage capacity.

Pimoroni NVMe Hat and Lexar Drive

The Pi stack sits on some M2.5 brass standoffs that screw into the standoffs supplied with the Pimoroni NVMe base, and some M2.5 button head screws through the bottom of the case hold it in place.

Securing Pi Stack In Enclosure

For software, I’m going to be using Nextcloud.

Nextcloud is an open-source software package that takes care of all of the backend work. To install it, we just download the prepared NextcloudPi OS image for a Pi 5 from their GitHub repository and then flash it onto our NVMe drive.

Flashing Nextcloud To NVMe Drive
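
If you’re flashing from another Linux machine and prefer the command line over Raspberry Pi Imager, something along these lines should work. The image file name and the /dev/sdX target below are placeholders, so swap them for your actual download and the device your drive shows up as, and double-check with lsblk first because writing to the wrong device will wipe it;

lsblk
# assuming an xz-compressed image, decompress and write it in one go
xz -dc NextcloudPi_RPi5.img.xz | sudo dd of=/dev/sdX bs=4M status=progress conv=fsync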

We then plug in our network cable and add power to the Pi to boot it up.

First Boot Of Pi and Join Network

Setting Up and Using Nextcloud

Leave the Pi to run through its first boot and setup process for about 5 minutes, and you can then access it by going to https://nextcloudpi or https://nextcloudpi.local in a browser on a computer on the same network as the Pi.
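
If that address doesn’t load, it’s worth confirming the Pi is actually reachable before digging any further. The .local name relies on mDNS, so from a terminal on the same network you can try;

ping -c 3 nextcloudpi.local

If that fails, look up the Pi’s IP address in your router’s client list and browse to that directly instead.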

The first time you go to this address you’ll be redirected to an activation page and you’ll need to copy the displayed temporary passwords to log in to your cloud server.

NextCloud Pi Activation Page

After clicking the Activate button at the bottom of the page, you’ll be prompted to log in and then run through a quick setup wizard.

Nextcloud Pi First Run

You’ll then land on your Nextcloud dashboard. This looks quite similar to Dropbox, iCloud or Google Drive and is fairly intuitive. You have a root directory to which you can add folders, and you can then add and start sharing your files.

Nextcloud Dashboard

Since it’s on your local network and isn’t limited by your internet upload speed, it’s quite fast too. We can copy a 750MB video file to the cloud in under 10 seconds, which works out to upwards of 75MB/s over the Pi’s gigabit Ethernet connection.

Uploading Large Files To Nextcloud
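
Uploads don’t have to go through the browser either. Nextcloud exposes a standard WebDAV endpoint, so as a rough example you can push a file up from the terminal with curl. The username, password and file name here are placeholders, and you may need --insecure if your instance is still using a self-signed certificate;

curl --insecure -u yourusername:yourpassword -T video.mp4 "https://nextcloudpi.local/remote.php/dav/files/yourusername/video.mp4"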

We can set up file or folder sharing with other users, similar to other cloud services. It’s also quick and responsive opening up other media like photos.

Photos Opening and Closing

There are a few other features like contacts, a calendar, notes and tasks, so Nextcloud does quite a lot more than just file storage. You can configure how these work and choose which of them are shown on your Nextcloud dashboard.
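
These apps can also be managed from the command line with Nextcloud’s occ tool if you prefer working over SSH. The path below assumes a standard install under /var/www/nextcloud, which is what I’d expect on the NextcloudPi image, so adjust it if yours differs;

# list installed apps, then enable the calendar as an example
sudo -u www-data php /var/www/nextcloud/occ app:list
sudo -u www-data php /var/www/nextcloud/occ app:enable calendar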

Final Thoughts On The Build and Omtech Polar

And that’s really all there is to it. You’ve now got your own personal cloud storage server running on your home network.

Raspberry Pi Nextcloud Pi Server

To access your server outside of your home network, you need to set up port forwarding on your router. I’m not going to go into this in this guide as you can open your network up to security issues if you don’t do it correctly, but it is an option so that you’re not limited to using it only within your home.

I’ve currently got mine set up to share documents and photographs with other users on my home network and I have a couple of digital photo frames that make use of the photo library on my server.

Omtech have recently started selling the Polar in Australia, so you can now get free and fast shipping within Australia too. The Polar looks like a well-built machine and Omtech have built up a good brand name, so I’m confident that I’ll be using this laser as my primary one on projects going forward.

I’ve already done some more testing on different materials to get a feel for the correct settings to use.

Let me know what you think of my Nextcloud Pi build and if you have any questions on the Omtech Polar in the comments section below.

CrowView Note 14″ Workstation – Unboxing and Review

The CrowView Note is a new laptop-style, self-powered portable monitor with a keyboard, trackpad, microphone and speakers built in. It has been designed to be quickly and conveniently connected to a Raspberry Pi, Jetson Nano or other single-board computer or mobile device and can also be used as a terminal for mini PCs or gaming devices.

Let me start off with a quick disclaimer that this is an early prototype that Elecrow sent over for me to try out and share with you. They’re only launching their crowdfunding campaign later this month, so there may be changes between this and the device that eventually ships.

Here’s my video review of the CrowView Note, read on for the written review;

Where To Buy The CrowView Note

Equipment Used

Unboxing And First Look

The CrowView Note comes in a white branded box. Inside the box is the CrowView Note, a power adaptor and two adaptor boards, one for a Raspberry Pi 5 and one for an NVIDIA Jetson Nano.

The body of the CrowView Note is plastic, although it is finished to look like aluminium. It would have been great to have an all-metal shell but I presume it’s plastic to keep the cost down.

CrowView Note Body Is Plastic, Made To Look Like Aluminium

It weighs a little over 1.1kg (2.5lbs) and has a built-in 5000mAh battery. We’ll see what this looks like in terms of battery life in a bit as this will depend on what it is powering.

Weight Of CrowView Note

First up, the display is a 14” 1080P IPS panel with a refresh rate of 60Hz.

Display Is 1080P IPS

Then we’ve got a keyboard and trackpad. The keyboard feels fairly decent and comfortable to use, and the key presses are good, although the frame is quite thin so there is a bit of flex if you push down hard on the keys. It doesn’t feel like a high-end keyboard, and it’s a similar story with the trackpad: you can’t click it at any position, as the left and right click only work at the front of the pad.

It’s got a range of function keys like most laptop keyboards. These keys control the volume and backlight, turn the trackpad on or off and provide some media controls.

Function Keys and Indicator Lights

Above the keyboard are some indicator LEDs. We’ve got a status indicator that is on when the CrowView Note is powered on, then Caps Lock and Num Lock indicators, and a microphone.

On the left side of the CrowView Note, we’ve got a USB port to connect the keyboard, trackpad and other IO. Then there is a mini HDMI port for the display input and a USB-C port to power the connected device at 5V and up to 5A. This was designed to match the Raspberry Pi 5’s power supply requirements.

Ports On Left Side Of Laptop

On the right side, we’ve got another USB-C port. This one is full-featured, so devices that support DisplayPort over USB-C can use a single cable to this port to charge and to connect to the CrowView Note’s display, keyboard, trackpad and other features. Next to that is a headphone jack that redirects the speaker audio, and then another USB port that can be used to add peripherals to the connected device.

Ports On Right Side Of Laptop

Lastly, we’ve got a 3.5mm barrel jack. This is for power, provided by the included 12V, 4A adaptor. It would have been nice to have power supplied through a USB-C port but I’m glad they’ve gone with this rather than a dangerous non-power delivery 12V USB-C power supply that I’ve seen on some other devices.

Power Adaptor 12V 4A

This little hole next to the power port is an indicator LED that lights up red when the internal battery is charging.

Laptop Charging Indicator LED

The speakers are underneath the CrowView Note. Their product sticker covers what seems to be an NVMe drive access slot. So, I assume this is a laptop frame from an actual laptop that has been repurposed for this product, again likely to keep costs down.

Speakers On Bottom

We’ll open the bottom up later to take a look inside.

So that’s an overview of the hardware, now let’s get a Pi connected to it to try it out.

Connecting A Raspberry Pi 5 To The CrowView Note

Flexibility is what Elecrow had in mind when designing the CrowView Note, so instead of providing a proprietary port through which a carrier board can be connected, they have designed the interfaces through standard ports. This means that you can either use their carrier boards to connect specific devices and remove the need for cables, or you can connect devices without a carrier board by using standard cables.

Standard Ports To Connect SBC To CrowView Note

To connect our Pi to the CrowView Note we’re going to use one of the two included carrier boards. We have one for a Pi 5 and one for a Jetson Nano developer kit.

The Pi version connects to the USB-C and HDMI ports on the side of the Pi and then uses a jumper across to one of the USB 3 ports on the front.

Plugging Adaptor Board Into Raspberry Pi

The carrier board then plugs into the three ports on the side of the CrowView Note.

Pi Adaptor Board Plugged Into CrowView Note

They’ve added acrylic to the bottom of the adaptor board to support the Pi, so it’s not just hanging on the ports. It’s supported by the desk underneath it, even if you press down on it.

Pi Supported By Acrylic Underneath Adaptor

I’ve charged the battery overnight so I’m not going to add the power cable at this stage. Let’s see how it goes being powered solely by the internal battery.

Pressing the power button provides power to the Pi 5 to boot up.

CrowView Note Powering Up

Once booted up we’re on the desktop. The display looks really good. The image quality is great, the colours look accurate and the viewing angle is relatively wide for a laptop-style display.

Raspberry Pi Booted Up

As I mentioned earlier, we’ve got a number of function keys across the top row. I quite like the battery indicator; pressing this key pops up an indicator on the bottom right of the display that shows how full the battery is.

Battery Indicator Function Key

There is also a full menu with display settings like a traditional monitor.

Display Control Menu

The trackpad is quite good. You’ll need to turn up the pointer speed, but there is no input lag. You can do the usual tap-to-click, but you can only physically left or right-click in the front third of the pad.

The stereo speakers are OK; they sound a little tinny at higher volumes, but it’s nice to have them included as an option. My video at the beginning of the post has a sample of the audio when watching a video.

Speaker Sound During Video Playback

The CrowView’s power button is not connected to the Pi’s power button, so you need to shut down the Pi safely by either pressing its power button or doing it through software. You’ll then need to press the CrowView’s power button to cut power to the Pi once it has shut down.

Removing Power After Pi Is Shutdown
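
Shutting it down through software is just the usual command from a terminal on the Pi (or over SSH);

sudo shutdown -h now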

We can also easily connect a Jetson Nano by using the second included carrier board. This is a bit quicker than with the Pi since all of the Jetson Nano’s ports are on one side already and we just have a jumper for power.

Adaptor Board Plugged Into NVIDIA Jetson Nano

My Nano’s carrier board is unfortunately a 9-19V board, so I’m going to have to use external power, as the CrowView Note only supplies 5V.

Jetson Nano Booted Up

Connecting Other Devices To The CrowView Note

It may seem a bit strange to have the Pi, Jetson or other computer outside of the laptop-style shell. Elecrow have made a Laptop-style computer for a Raspberry Pi previously which integrated it into the enclosure. This results in a very thick laptop base which reduces portability and access to the Pi, so I prefer this arrangement. This design also allows you much more versatility to use different single-board computers, mini-computers and mobile phones, all with minimal effort.

The adaptor board for the Pi 5 will also work on the Pi 4 if you want to connect one of those.

Adaptor Fits Pi 4 As Well

Or, as I mentioned earlier, you can also connect any single-board computer up using standard cables instead of the adaptor board.

Can Use Standard Cables To Plug Pi Into

If you’ve got a device like a mobile phone or tablet that supports a USB-C attached display, then a single cable provides another display and connects the peripherals.

Single USB C Cable Connect iPad To CrowView Note

Battery Life On The CrowView Note

In terms of battery life, I got about 2.5 hours when running the Pi 5 with an NVMe drive and a moderate load. You’d probably get close to 3.5 hours on a light load or idle on the desktop. The CrowView Note won’t safely shut down the Pi when the battery is empty, so you’ll need to keep an eye on it.

Raspberry Pi and Jetson Nano Connected To CrowView Note

They made the USB-C power port specifically to match the requirements of the Pi 5, but it would have been nice to have power delivery available at a variety of voltages. This would provide support for more power-hungry devices and fast charging, so you could use the built-in battery as a sort of power bank.

Inside the CrowView we can see the battery, which is a 7.4V pack, so with its 5,000mAh capacity that gives us 37Wh. It would have been great if they had increased the capacity of the built-in battery to take advantage of all of the space in the enclosure.

Battery Inside CrowView Note

Final Thoughts On The CrowView Note

I think the CrowView Note is perfect for someone who does a lot of travelling or works on a range of single-board computers fairly often, as I do. It’s obviously not for everyone and isn’t going to suit someone who wants to use it as a traditional laptop-style computer. If you’re not making use of the ability to swap out the connected device often, then this probably isn’t for you.

I’ve used this setup with a portable monitor and wireless mouse and keyboard quite a lot in the past and I think the CrowView Note would fit right in as an all-in-one replacement for it.

Display and Keyboard Setup That I Usually Use

It would also be great to see a wider range of adaptor boards made available for different SBCs. I would be interested in one for the Pi Zero 2 W.

A big part of whether this is going to do well or not is going to hinge on the pricing. It’s going to be competing against someone buying similarly specced hardware individually, so its price needs to be competitive with that. I’d say somewhere around $150 would be good value for money, but we’ll have to wait and see what they launch at.

CrowView Note 14 Display and Keyboard

This product is still in the Crowdfunding stage, which obviously comes with some inherent risks. Use your own discretion when supporting these types of projects. Elecrow have been around for several years and have successfully crowdfunded 5 other products, so they’ve been reliable so far.

Let me know what you think of the CrowView Note in the comments section below. What can you see yourself using it for?

Pironman 5 NVMe Raspberry Pi 5 Case Review

Today we’re going to be taking a look at the new Pironman 5 case by Sunfounder. This case has been designed to house a Raspberry Pi 5 along with an NVMe storage drive.

It’ll set you back almost $80, but it has quite a lot to offer. So let’s see how easy it is to put together, how the Pi and NVMe drive perform in it and whether it’s worth the money.

Pironman 5 Case In Box

This is the second generation of the Pironman case, although it doesn’t look all that similar to the first. The first generation was made for the Pi 4, which didn’t have a PCIe port. This case takes advantage of that port to add an NVMe drive to the Pi without the need for an external USB jumper. They’ve also shifted from a design with two opposing clear side panels to a wrap-around clear panel design.

Pironman Case By Sunfounder

Here’s my unboxing and review video, read on for the write-up;

Where To Get The Pironman 5 Case

Unboxing and Assembly

The case comes in a fairly large white branded box. Inside it is the aluminium case shell packed full of the case components, including fans, cooler, expansion boards and mounting hardware.

Pironman 5 Case Components Unpacking

Oddly, I did get a random open and empty sleeve in mine. I don’t know if that means something is missing or if it just found its way into the box but I guess I’ll find out when assembling it.

Empty Sleeve Included

You can see from the spread that there are a lot of components to this case. Like with the previous Pironman case, it’s going to take more than a couple of minutes to put it together.

Included Components

Assembling The Pironman 5 Case

To assemble the case, you start by adding standoffs to one enclosure half.

Installing Brass Standoffs

You then plug some of the carrier boards into the Pi, then mount the assembly into the case.

Mounting Pi and Adaptors To Standoffs

The power button and cooler go in next. They supply thermal pads for the WiFi chip and power circuitry, which is a bit different; previously this cooler covered the RAM chips and the USB and Ethernet controllers.

Thermal Pads For Ice Cube Cooler

Next, we install the NVMe drive.

NVMe Drive Installed On Carrier Board

The NVMe drive carrier board supports multiple drive sizes from 2230 to 2280. I’m going to be using a 2280-size Lexar drive in the case. This is not all that fast as far as NVMe drives go, but it runs at gen-3 speeds, so it should get quite close to the maximum speed that the Pi can handle.

Drive Mounted Into Case

We then attach the fans to the back panel.

Mounting Fans On Back Panel

With that done, we can mount the GPIO expansion board with the OLED display.

IO Board With OLED Display

Then screw the two aluminium case halves together and stick the display onto the front panel.

Screw Panel Halves Together

To finish it off, we add the clear acrylic panels.

Peel Off Protective Coating On Acrylic

Assembly is relatively easy. The case comes with a good step-by-step illustrated instruction sheet and the hardware is labelled well too. It took me about 30 minutes to have the case complete and ready to boot up.

Pironman 5 Software and Web Dashboard

Booting up the Pi is done by pressing the power button on the front of the case.

Pironman 5 Case Completed

To get the display and lighting to work, we need to install some additional software. This is a fairly simple GitHub install by entering the following commands in the terminal:

cd ~
git clone https://github.com/sunfounder/pironman5.git
cd ~/pironman5
sudo python3 install.py

The script took about a minute to install and I didn’t run into any issues with it.

Installing The Pironman Software

The fans turn off and the OLED display comes on when the installation completes, but it still recommends a reboot. The OLED display on the front shows the CPU usage and temperature, the computer’s IP address as well as the RAM and storage capacity.

OLED Stats Display Pironman 5

Their software also has a really nice web dashboard, accessible by entering the IP address and port 34001 into the browser.

http://<ip>:34001

This dashboard allows you to see your system stats, plot graphs of the stats, see logs and access some of the case script settings.

Pironman 5 Web Dashboard

You can also change the style and colour of the RGB lighting, adjust its pulse mode or set it to cycle through different colours.

Testing The Case’s Cooling Performance

One of Sunfounder’s claims about the case is superior cooling. I personally use one of these tower-style coolers on my Pi 4s and 5s running in my 3D-printed cases. So I’ll use that as a baseline and see if the additional fans and this slightly different cooler base make any difference to the thermals.

My Pi Case Cooling Setup

I’m also interested to see how loud the case is when it is running as we’ve got three 40mm fans in it. The back two are not PWM fans, so they’re either on at full speed or they’re off. The default setting is for them to turn on when the CPU reaches 60°C, but this can be adjusted.

I’m going to adjust them to have the back two fans running at full speed for the stress test to see how it compares to my case’s thermals.

My case has a single 40mm PWM fan and I’ll set this to 100% for the comparison as well.

I’m going to be using CPU Burn to apply full load to the CPU and we’ll leave that running to see what temperature it stabilises at. I doubt we’ll get anywhere near thermal throttling with this cooling setup.

To install CPU Burn, we enter the following command into the terminal;

wget https://raw.githubusercontent.com/ssvb/cpuburn-arm/master/cpuburn-a53.S
gcc -o cpuburn-a53 cpuburn-a53.S

Then to run the test, we enter the following command;

while true; do vcgencmd measure_clock arm; vcgencmd measure_temp; sleep 5; done& ./cpuburn-a53

CPU Burn Running

At the stock CPU frequency of 2.4GHz, at idle, the CPU temperature started at 25°C. I left the test running for a little under 30 minutes and it stabilised at 52°C. So we’ve got a delta of about 27°C, which is really good. This means you’ve got a lot of headroom for overclocking.

Temperature Trend Over 30 Mins
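
As a rough illustration of what that headroom could be used for, an overclock on the Pi 5 is set in /boot/firmware/config.txt. The values below are just an example rather than something every board is guaranteed to run stably, so treat it as a starting point and back the numbers off if you see crashes;

# add to /boot/firmware/config.txt, then reboot
arm_freq=2800              # CPU frequency in MHz, up from the stock 2400
over_voltage_delta=50000   # extra core voltage in microvolts to help stability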

The fans are not as noisy as expected, but they would be annoying if you had this on a desk next to you and they were running the whole time. I’ve put an audio clip of them running into my video at the beginning of the post. Thankfully, having them only turn on at 60°C means that they’re off most of the time and only come on when you put a demanding load on the Pi.

With the fans still on, the Pi’s CPU drops back down by 10°C in about 5 minutes.

Cooldown After Test Stopped

I ran the same test on my case. This time the starting temperature was 22°C, so 3°C below the Pironman case. The temperature reached equilibrium much faster but I left it running for 30 minutes too. It stabilised at 36°C. So we had a delta of about 14°C, which is significantly better than on the Pironman 5 case.

I think this is mainly down to the airflow path. My case design has airflow straight across the cooler and out some large ventilation holes on the opposite side.

Ventilation On Opposite Side

The Pironman case doesn’t really have air inlet vents and the fans at the back are each pushing air through a restrictive dust filter. So although it’s got two more fans than my setup, they aren’t working as effectively.

Testing The NVMe Drive Speed

To test the NVMe drive speed, I’m using James Chambers’ Pi Benchmarks script. This script favours random read/write performance, so it is a good representation of how an OS would use the drive.

To install and run the test, we enter the below command into the terminal;

sudo curl https://raw.githubusercontent.com/TheRemote/PiBenchmarks/master/Storage.sh | sudo bash

Over 3 consecutive tests, I got an average score of 51,963.

James Chambers Pi Benchmarks Script

This is a similar score to what I got using the Pimoroni NVMe base with this drive on my setup, where I got 51,902 and an average of 51,865 across runs. So that’s a good indication that there are no issues with the drive adaptor.

Pi Benchmarks Script on my Pi 5 Case

It is quite a bit slower than the speeds I got in my recent NVMe hat comparison, but that was done with a Sabrent Rocket drive which is much faster than the Lexar drive I’m currently using.

Final Thoughts On The Pironman 5 Case

Overall I think the case looks great and provides some nice functionality over the stock Pi. I particularly like the full-size HDMI ports over the stock micro HDMI ports, and the fact that the cables all run out of the back of the case rather than out of both the back and the side.

Pironman 5 Case Completed

There are also a few good improvements on the original design, assembly is quite a bit easier and it doesn’t rely on the side panels to hold the metal case components in place. So you can have the side panel removed to work on the Pi without compromising on the case’s structural integrity.

Rear of Pironman 5 Case

In terms of size, it is slightly larger than my 3D-printed case. It measures 112 x 117 mm and is 79mm wide, but they’ve managed to cram a lot of features into the small space.

Pironman 5 Case Size Comparison

It’s obvious that Sunfounder have tried to make this the best case for a Raspberry Pi 5. It comes with quite a high price tag, but in terms of value for money, it is fairly good. You can pick this up and you don’t need to worry about getting a separate cooler, NVMe hat, fans or an OLED display, and you still have access to the core Pi’s functionality like its GPIO pins.

It also comes with some nice finishing touches, like labels for the ports, and includes plenty of additional screws and mounting hardware so you’re covered if you lose some of them.

Included Additional Hardware

I didn’t find anything missing from the case hardware, so I assume the empty sleeve made its way into the package by accident.

If I had to pick out some things to be critical about, I probably would have made the back fans PWM controllable as well. They would then run a bit quieter, although being able to turn them off most of the time in software partially helps with this already.

The dust filters on the back fans are also unnecessary, as these are exhaust fans, so you’re filtering dust out of air that is leaving the case. The filters would be better placed over some inlet vents, which would also improve airflow through the exhaust fans.

Rear Fan Exhaust Dust Filters

Let me know what you think of the Pironman 5 case in the comments section below.

The New Beelink GTi 14 Ultra Has A PCIe x8 Slot

Beelink got my attention two weeks ago when they reached out about a new mini PC that they were about to launch, the GTi 14 Ultra, which has a built-in full-size PCIe gen 4 x8 slot.

Beelink GTi 14 Ultra PCIe Port

One of the biggest weaknesses in these mini PCs is graphics performance. They typically rely on an integrated GPU, and while some have reasonably powerful integrated graphics, they don’t come close to having a dedicated GPU.

I recently showed a way to get around this by tapping into an M.2 port on a PC with an Oculink adaptor. This worked fairly well, but it was limited by the single PCIe lane and looked like a bit of a hack job, even in a custom 3D-printed case. So I’m super excited to try out this new PC with a fully accessible PCIe slot built in.

Here’s my video review of the GTi 14 Ultra, read on for my written review;

Where To Buy The Beelink GTi 14 Ultra

Tool & Equipment Used

  • USB-C Pencil Screwdriver – Buy Here
  • Infiray P2 Pro Thermal Camera – Buy Here

Unboxing The GTi 14 Ultra

The GTi 14 Ultra comes in a white Beelink branded box with minimal text on it. Let’s get it unboxed and see what is included.

Beelink GTi 14 Ultra Box

There are two versions of this PC that are going to be available. This is the less powerful Intel Core Ultra 7 version and it also comes in an Intel Core Ultra 9 version.

Removing the lid, we’ve got the GTi 14. Its aluminium housing is protected by a matt plastic film.

Beelink GTi 14 Ultra Unboxing

Underneath the computer, we’ve got two cables. It looks like it’s got the power supply integrated into the PC enclosure, since we’re only provided with a power lead. This is a bit different to typical mini PC designs, which usually rely on a 19V laptop-style power supply.

Power Cable Included, Supply Built In

So included in the box is the mini PC, a power cable, an HDMI cable and a short user manual.

Unboxed GTi 14 Ultra

The GTi 14 Ultra is a fair bit larger than most mini PCs I’ve tried previously, but that’s due to the integrated power supply, the full size PCIe port and the need for better cooling on the more powerful CPU.

You definitely get Mac Mini vibes from it.

Mini S vs GTi 14 Ultra Size Comparison

Another thing worth mentioning is that this mini PC doesn’t come with a VESA mount to mount it onto the back of a monitor. It’s designed to be placed onto a desk.

No Vesa Mount On Bottom

GTi 14 Ultra’s Specifications and Interfaces

The GTi 14 has an Intel Core Ultra 7 155H processor, which is essentially a CPU, GPU and NPU all on a single chip. This is a mobile processor with 24MB of cache and 16 processor cores: 6 performance cores that can run at up to 4.8GHz, 8 efficiency cores at up to 3.8GHz and 2 low-power efficiency cores.

Beelink GTi 14 Ultra Unboxed

It has an integrated Intel Arc GPU with a maximum frequency of 2.25GHz and this supports hardware-based ray tracing, so we should be able to run some games on it to try that out.

It’s got 32GB of DDR5 RAM running at 5600MHz and a 1TB NVMe SSD.

In terms of IO, on the front, we’ve got a USB 3.2 port, a full-size SD card slot, a USB C port, a 3.5mm audio jack, a power indicator LED and a power button. The power button also has an integrated fingerprint sensor on it. The four holes along the top are for a microphone array suited to voice recognition.

Ports On Front

The two sides have nothing on them, they’re just bare aluminium.

Nothing On Sides

On the back, we’ve got our AC input, another USB C port (this one is a Thunderbolt 4 port), another 3.5mm audio jack, an HDMI port that can do 4K 60Hz, a DisplayPort that can do 4K 144Hz, two more USB 3.2 ports above two 2.5G Ethernet ports, and another two USB 3.2 ports alongside them.

Ports On Back

In addition to the 2.5G Ethernet ports, it’s also got WiFi 7 and Bluetooth 5.4.

The last port, and the one I’m most excited to try out, is the PCIe gen 4 x8 slot, which is accessible through the bottom of the PC under a thin rubber cover.

Ventilation Mesh On Bottom

Cooling is achieved by drawing air in through the mesh on the bottom and then exhausting it through these vents at the back.

Taking A Look Inside The GTi 14 Ultra

Next, let’s open it up and take a look inside. Four screws hold the bottom cover in place. With those removed, we can see two integrated speakers and a dust filter to protect the components.

Bottom Cover Removed, Speakers & Dust Filter

Under the filter is the speaker assembly and our power supply. These make it a bit more difficult to get to the RAM and SSD, but I’m going to go ahead and remove them so we can get a good look at them.

If you try this yourself, you need to remove the speaker assembly first and then the power supply. This is quite an interesting design: it gets AC power from the port at the back and then feeds 19V DC into the motherboard through these two standoffs. Lastly, remove the cover plate underneath them and then you’re in.

Then we can see our RAM and storage. The RAM is in a dual-channel configuration and is upgradable to a maximum of 96GB. There is also a second slot to add another M.2 NVMe storage drive. Alongside that is a removable M.2 WiFi adaptor and then our PCIe port. It looks like we’ve got a x8 and a x1 slot alongside each other.

First Boot and Benchmarking

Next, let’s close up the bottom cover and try booting it up. The GTi 14 Ultra comes with a clean install of Windows 11 Pro. Once set up, you can log in using the fingerprint sensor on the front, which is impressively fast.

First Boot Windows 11 Pro Login

I’ve had a look through the software and there doesn’t seem to be any preinstalled bloatware or spyware. You need to be careful buying mini PCs from Amazon or Aliexpress as they’re often filled with nefarious software. Beelink are a reputable brand and have been around for a while so they’re a safe bet.

Opening up the performance monitor, we can see our CPU is an Intel Core Ultra 7 running at a base speed of 3GHz. We’ve got our 32GB of RAM running at 5600MHz and our 1TB storage drive shows up as well. Our GPU is an Intel Arc and it’s sharing 16GB of RAM.

Next, we’ll run two benchmarks on it. The first is Geekbench to test CPU and GPU performance.

The CPU benchmark took 4 minutes 30 seconds to complete and the fan was surprisingly quiet throughout. You could hear it running, but it’s nowhere near as loud as some other mini PCs I’ve tested. We get a single-core score of 2,270 and a multi-core score of 11,834. So the single-core score is fairly average, but the multi-core score is good.

The GPU benchmark took just under 2 minutes and we got a score of 37,460. This is very good for an integrated GPU.

Next, let’s run Furmark to test the computer’s GPU and thermals. Like with the Geekbench benchmark, the fan came on almost immediately but wasn’t all that loud for the duration of the test.

On completion, we get a score of 1,920. Unfortunately, the GPU temperature couldn’t be recorded.

Furmark Running

Gaming On The Stock GTi 14 Ultra

Now that we’ve run some benchmarks, let’s try running some games on it. We’ll start with Counterstrike 2.

It handled the home screen pretty well, so I set the graphics settings to Very High. In game, we get around 50-60fps, which is very playable and very good for a PC without a dedicated GPU running all settings on Very High. We’ll see how it compares when we add a GPU through the PCIe slot later.

Counterstrike 2 On Stock PC

Next, I tried running Doom Eternal. It had a bit of a freak-out when starting the game but it ran well after fixing the aspect ratio and setting the resolution back to 1080P.

Doom Eternal on Stock PC

I had all graphics settings on Ultra Nightmare and Ray Tracing turned off. In game, I was getting a little over 60fps fairly consistently, which is also really good.

Doom Eternal On Stock PC Gameplay

Turning ray tracing on didn’t make a significant difference to performance, we lost about 10fps.

So for 1080P gaming on this mini PC, you really don’t need an external GPU. It does very well with the integrated Intel Arc graphics. Both games were easily playable, but we’re going to see how the PCIe port performs in any case.

So let’s plug our GPU into the GTi 14 and get the AMD drivers installed.

Gaming On The GTi 14 Ultra With An External GPU

I 3D printed a new bottom cover for the GTi 14 Ultra which has some mounting points for a PCIe riser to plug the GPU into. I’m using an XFX Radeon RX 6600.

External GPU Plugged In

Adding The GPU To The GTi 14 Ultra

From the photographs on Beelink’s product page, it looks like they plan on selling an external GPU dock which will plug into the GTi 14 Ultra’s PCIe port. This isn’t yet available so I had to make another plan.

GTi 14 Ultra eGPU Support

I 3D printed a new bottom cover for the PC. This picks up on the same cover plate screw holes but is offset with some M2.5x12mm standoffs.

3D Printed Back Plate

I then plugged a Coolermaster PCIe riser into the PCIe port and mounted the female port onto the 3D-printed cover plate with some M3x8mm screws into M2.5 brass inserts.

GPU Mounting Arrangement

To power the GPU, I used a 12V 10A power supply which I soldered to an 8-pin plug that I salvaged off a Molex to 8-pin adaptor.

Gameplay With The GTi 14 Ultra

In Counterstrike, with graphics again set to Very High, we’re now getting over 200fps. This is 3 times what we were getting on the integrated GPU. The game is responsive, and the PC seems to be running a bit cooler as the fan is noticeably quieter.

Next, let’s try Doom Eternal. Again with the same Ultra Nightmare graphics settings and Ray Tracing turned off, we’re getting over 150fps. This is about 2.5 times better than on the integrated GPU. I also noticed significantly faster load times with this setup.

Testing Power Consumption & Speakers

With the PC returned to its stock state with nothing plugged into the PCIe port, the GTi 14 Ultra uses around 30W when idle on the desktop. It maxes out at a little over 80W with the GPU and CPU being utilised during gaming.

The built-in speakers are a nice inclusion. They lack bass because of their size but don’t sound terrible; they’re about on par with a mid-range laptop. You can hear a sample of the audio in my YouTube video.

Speaker Sound Test

Final Thoughts On The Beelink GTi 14 Ultra

Overall I think that this is a really awesome mini PC. It’s ultra-portable and having the ability to plug a GPU directly into it gives you the flexibility to use it for some fairly demanding gaming when you’ve got a bit more desk space and don’t need to carry it around. It’s also upgradable with non-soldered RAM and an additional M.2 port.

Beelink GTi 14 Ultra

I have two criticisms though.

One is that the integrated power supply doesn’t seem to go into a proper dormant or sleep state, even when the PC is completely shut down. In the below images, it had been off overnight and the enclosure was still noticeably warm. The power meter registers about 2.5W with the computer shut down, so it’s using power for no reason.

The second is the implementation of the PCIe slot. It is very deeply recessed in the enclosure and the access slot through the enclosure is too thin for most standard risers. I assume that Beelink are going to release some sort of proprietary dock, but it would have been nice to have the slot easily accessible through the bottom cover with a standard riser cable.

I’m not sure what the pricing is going to look like as they’re not yet for sale at the time of writing, but I’d imagine they’ll be around $800 for the Ultra 7 series and likely $100 more for the Ultra 9 series.

Let me know what you think of the GTi 14 Ultra in the comments section below. What would you use it for?

Raspberry Pi 5 vs Intel N100 PC – Which Is Right For You?

On a couple of my YouTube videos since the launch of the Raspberry Pi 5 last year, people have said that for the price of the Pi 5, you should just get an Intel N100 based mini PC instead. Most cite better video encoding and decoding performance, better OS support, more memory & storage options, and additional PCIe lanes as advantages over the Pi 5. So, today we’re going to compare the two and see whether an N100 Mini PC is a better option and what the limitations of each of them are.

Intel Alder Lake N100 CPU

If you don’t know what an N100 PC is, it’s a PC, often in a mini PC form factor, built around Intel’s Alder Lake N family, in this case the N100 CPU. For a long time, Raspberry Pis were substantially cheaper than any newly available Intel hardware, but Pis have since crept up in price and this series of processors is now cheap and efficient enough to close that gap to the point where the two are becoming quite comparable.

Here’s my video of the comparison, read on for my write-up;

Parts Used For This Comparison

Pi 5 & N100 PC Setups Being Compared

For this comparison, I’m going to be using the following two setups.

The Pi 5 is an 8GB variant and I’m going to be booting it up from a Pimoroni NVMe base with a Lexar 500GB NVMe SSD. I’ve also added an official active cooler and power supply which, along with the NVMe base and storage drive, comes to a total of $160.

Pi 5 Setup For Testing

The N100 PC I’ve chosen is the Beelink Mini S12 Pro. This was on special for $159 when I bought it, so it was one of the cheapest options available on Amazon at the time. There were two cheaper options for $154 and $155 but I didn’t recognise either of these brands and I’ve used Beelink products before without any issues so I was happy to pay the extra $5.

Beelink Mini S N100 PC

So pricing between the two is really similar once you’ve added all of the required components to the Pi 5, and with the N100 we’re getting double the RAM and an included enclosure.

Pricing For N100 and Pi 5

In terms of basic specifications, the Pi 5 has a Broadcom BCM2712 SoC, which has a 4-core Arm Cortex-A76 processor running at up to 2.4GHz. It’s also got a VideoCore VII GPU.

Pi 5 Broadcom BCM2712 Processor

The N100 is a 4-core Alder Lake N processor made up entirely of Intel E-cores running at up to 3.4GHz, paired with integrated UHD graphics.

Beelink Alder Lake N100 Processor

Both of these computers use DDR4-generation RAM. The Pi 5 has 8GB of LPDDR4X running at 4267 MT/s and the N100 PC has 16GB of DDR4 running at a slower 3200 MT/s.

N100 Mini PC Inside

In terms of storage, both have a 500GB NVMe SSD.

Both computers have similar connectivity options – Gigabit Ethernet, two HDMI ports and four USB ports, although two on the Pi are USB 2.0 instead of all four being USB 3.0 like on the N100 PC.

They both have an M.2 port for an NVMe drive, but the N100 also has a SATA port for a 2.5” drive, and the Pi has a couple of other interfaces like dual 4-lane camera/display transceivers and a 40-pin GPIO header – we’ll discuss this a bit more towards the end of the comparison.

The Pi 5 has a single PCIe lane that can run at gen. 3 speeds, to which the NVMe drive is connected. The N100 PC has a built-in M.2 port which makes use of 2 PCIe lanes, also running at gen. 3 speeds (a single gen. 3 lane tops out at a little under 1GB/s). So we’d expect the storage speed on the N100 PC to be quite a lot faster than on the Pi.

Perhaps the most significant difference between the two is that the N100 is an Intel x86-based system while the Pi 5 is an Arm-based system, so you’ve got far more options for compatible operating systems on the N100 PC than on the Pi 5.

M.2 Port On N100 PC

To make testing fair, we’ll be running Ubuntu on both since Ubuntu Desktop 24.04 is available as an officially supported OS through Raspberry Pi Imager and is available for the N100 mini PC as well.

Testing The Pi 5 & N100 PC

To compare the performance of the two, we’re going to run the series of tests below. These should give us a pretty good idea of the capabilities and limitations of each system.

  • Video Playback at 1080P in a Browser
  • A Sysbench CPU Benchmark
  • An NVMe Storage Speed Benchmark
  • GLMark2 GPU Benchmark
  • Power Consumption Test

Video Playback at 1080P

Let’s start with video playback at 1080P.

The Pi 5 struggled with this more than I expected it to. It stuttered badly and dropped a significant number of frames at the beginning. Even once playback settled, it still continued to drop frames.

Pi 5 Playing Back Youtube Video

From my experience, the Pi 5 handles video playback in Raspberry Pi OS, which is based on Debian, without any issues, so this is most likely a software issue.

The N100 PC had no problem playing back the 1080P video. Playback was smooth right from the start and was unaffected when running in the window or fullscreen.

N100 PC Playing Back Youtube Video

So both can handle 1080P video playback but the N100 PC is much better at it.

Sysbench CPU Benchmark

Next, let’s run a Sysbench CPU benchmark. I ran three tests on each computer and then averaged the scores.

I ran the following test on each of the computers;

sysbench --num-threads=4 --test=cpu --cpu-max-prime=20000 --validate run

The Pi 5 managed an average score of 40,359

Pi 5 Sysbench Results
  • Actual scores – 40907, 40023, 40148

The N100 PC managed an average score of 44,058

N100 Sysbench Results
  • Actual scores – 44022, 44096, 44056

So the N100 PC was about 9% faster than the Pi 5. This is not as significant as I was expecting given the much higher clock speed on the N100’s cores, but there is a small CPU performance gap between the two.

Pi 5 Sysbench CPU Benchmark

The N100’s results were also far more consistent than the Pi 5’s, which may suggest that the Pi runs into some sort of thermal limitation when the tests are run in quick succession.

NVMe Storage Speed Benchmark

To test the NVMe storage speed, I used James Chambers’ Pi Benchmarks script. This script favours random read/write performance, so it is a good representation of how an operating system would make use of the drive.

To run the test, enter the following command in the terminal;

sudo curl https://raw.githubusercontent.com/TheRemote/PiBenchmarks/master/Storage.sh | sudo bash

Over three tests, the Pi 5 managed an average score of 32,089 with average sequential read speeds of 423MB/s and average sequential write speeds of 241MB/s.

Pi 5 Pi Benchmark Results
  • Actual scores 31154, 32431, 32683
  • Actual read speeds 425, 432, 412
  • Actual write speeds 247, 239, 237

The N100 PC managed an average score of 44,803, so significantly higher than the Pi 5, with an almost 40% improvement. Average sequential reads were around 673MB/s and average writes 495MB/s.

N100 Pi Benchmark Results
  • Actual scores 45149, 44992, 44267
  • Actual read speeds 668, 669, 683
  • Actual write speeds 507, 491, 487

GLmark2 GPU Benchmark

The N100 has a much more powerful GPU, so I expect it to do a lot better than the Pi in our GLMark2 GPU benchmark.

This benchmark needs to be downloaded and built from source code, and is then run by entering the below command in the terminal;

glmark2

GLmark2 GPU Benchmark On Pi
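
For reference, a from-source build on Ubuntu looks roughly like this. The meson-based build and the GitHub URL are based on the current glmark2 repository, so check its README for the full dependency list and the flavour options that match your setup;

sudo apt install git meson ninja-build pkg-config libpng-dev libjpeg-dev libgl-dev libx11-dev
git clone https://github.com/glmark2/glmark2.git
cd glmark2
meson setup build -Dflavors=x11-gl
ninja -C build
sudo ninja -C build install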

The Pi 5 managed a score of 307.

Pi 5 GLmark 2 Benchmark

The N100 PC managed a score of 2070.

N100 GLmark2 Benchmark

So the N100 is over 6 and a half times faster than the Pi 5 in GLMark2, which is obviously a substantial difference.

Power Consumption

Lastly, let’s look at power consumption. This is where I have high hopes for the Pi to stand out.

At idle the Pi 5 uses around 3-4W, and this goes up to 8-9W under load.

The N100 PC uses quite a bit more power, using 8W idle on the desktop and up to 27W under load.

While neither of these figures is particularly high, it’s worth noting that the N100 uses roughly three times the power of the Pi 5 under load, and more than double at idle. This probably makes little difference on mains power, but for battery-powered projects that need to run for many hours or even a few days, the difference can lead to substantial savings in power supply hardware and batteries; from a 50Wh battery pack, for example, you’d get roughly six hours out of the Pi 5 at full load versus under two hours out of the N100 PC. This is not all that surprising – Arm computers are known to be power efficient, which is one of the reasons they’re so popular for mobile devices.

Conclusion & Final Thoughts

So, the N100 PC beats the Pi in almost every performance benchmark and comes in at a similar cost.

Test Results For Pi 5 and N100 Comparison

One of the main reasons that people list for getting a Pi over an N100 PC is the GPIO pins, and these are without question much easier to use on the Pi. The GPIO pins are literally available right on the board and there is a wealth of software and tutorials available to utilise them.

That doesn’t mean that you’re out of options for the N100 PC. Microcontrollers like an Arduino Pro Mini or Nano, or even one of these purpose-built Adafruit FT232H USB to GPIO breakout boards make it equally possible to connect tiny OLED displays, read in information from sensors or just work through an introductory flashing LED tutorial on a PC, with relative ease.

Adafruit FT232H Module
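
As a rough sketch of what getting started with the FT232H looks like, Adafruit’s Blinka library lets you drive it with the same CircuitPython-style API used on the Pi. The package names and environment variable below are from my reading of Adafruit’s FT232H guide, so treat them as assumptions and follow their guide for the driver/udev setup on your OS;

pip3 install pyftdi adafruit-blinka
export BLINKA_FT232H=1                          # tells Blinka to target the FT232H instead of a Pi
python3 -c "import board; print(dir(board))"    # should list the FT232H's C0-C7 and D4-D7 pins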

This is not as integrated as on the Pi and comes at an additional cost, but it might be worth the few extra dollars if you’re just getting started tinkering with electronics.

So, if you plan on using the computer for automation or robotics with a reliance on the GPIO pins then the Pi 5 is the better option, but for experimenting with home server projects, running anything reliant on a GPU, or getting started with Docker or Kubernetes then the N100 mini PC is a great alternative.

I think Raspberry Pi have missed the mark a little with the pricing of the Pi 5. If you are just looking for a cheap computer to get into tinkering with electronics projects, then you’re probably better off going for a base version of the Pi 4. This still has plenty of CPU power to run projects locally and you’ll have access to a similar set of IO to the Pi 5, but without the additional cost. After all, a big part of the initial attraction of the Raspberry Pi was the $35 base price!

Raspberry Pi 4B

Let me know which you prefer and what your use case is in the comments section below.