03 Dec 2017

Vega 56 Introduction

I’ve finally upgraded the last piece in Flammenwerfer – my 2017 build using AMD’s Ryzen. The GTX 960 served me very well, but alas, it had no place in an 8-core, 16-thread system. I opted for the Vega 56 – this article will go into why I chose it and my first impressions, and later on I will delve into the performance metrics.

The main problem with the RX Vega 56 (and the 64) at the moment is the pricing. As of writing this article, prices seem to range from £450-550 (which is ridiculous). I was very lucky to spot a PowerColor RX Vega 56 for £380 – which is getting very close to MSRP levels. Since then, prices have gone back up – so I am very glad I went for the card when I spotted it.

A very tasty upgrade over the ASUS GTX 960 Strix (pictured above the Vega 56). I personally like the look of the Vega 56 – though some have said they find it boring. The RADEON logo on the top side lights up red, at just the right brightness. In my build (pictured at the top), I have gone for a black and red theme, which this card fits absolutely spot on.

Why Vega then? Why didn’t you get a 1070? My reasons, in no particular order:

  • My LG 29UM68 supports FreeSync
  • The reference cooler matches my build (no aftermarket coolers yet 🙁 )
  • I get to use all of my CableMod custom cables
  • The performance is awesome – more than plenty for my needs
  • New architecture – better support for new APIs (e.g. Vulkan)
  • AyyyyMD (I am a bit of an AMD fanboy – supporting the underdog!)

My initial gripes:

  • I’m fairly positive I’ve heard jet engines quieter than this card under load
  • The drivers are very beta at the time of writing
  • The power draw is a tad high (especially compared to a 1070)

Fortunately, the card is extremely quiet at idle, and the idle power consumption is very good (on par with the 960). The 650W power supply in this system is perfect: with everything maxed out (and I mean maxed), the most I have been able to record at the wall is 550W AC – about 500W DC. Usually whilst gaming I see about 300W.

As for the noise under load, it’s definitely noticeable. It’s more of a “whoosh” than a “whine” – although it can get quite loud, it’s not annoying. Wearing headphones or having the sound turned up in a game quickly removes the noise issue for me. I have, however, discovered the best way to get rid of the noise – and that is to use FreeSync.

From AMD’s marketing slides on FreeSync

Holy balls. FreeSync. It’s the future, man. This LG 29UM68 has a variable refresh range of 40–75Hz – essentially, the monitor refreshes at the same rate that the GPU can render frames. This gets rid of the tearing you see with V-Sync disabled, and the latency you get with V-Sync enabled (since the monitor isn’t duplicating frames when the frame rate dips below 60Hz).

The trick, I have found, is to enable FreeSync in the AMD driver, then enable V-Sync in games with the maximum refresh rate set to 75Hz. This caps the frame rate at 75FPS whilst allowing it to dip below without stuttering or latency issues. Additionally, this means the GPU isn’t cranked to 100% all the time – resulting in lower power usage, thermal output and fan speeds!

FreeSync is one of those technologies you have to see to believe. It’s a night and day improvement over a standard 60Hz experience. My monitor does 75Hz, which is immediately waaay smoother than 60, but you can easily find screens that do 144Hz. G-Sync offers a similar feature-set to FreeSync, except that due to its proprietary nature the panels all cost significantly more than the equivalent FreeSync displays. So you can either spend more on Vega and get a cheaper display, or spend less on an Nvidia card and pay more for the display… nice.

In the next posts I will talk about the performance and overclocking!

16 Jul 2017

USB 3.1 10 Gigabit – Why?

My shiny new ASRock X370 Fatal1ty Gaming K4 supports USB 3.1 10 Gigabit through a standard type A port and a fancy new type-C port. Having owned the Samsung Galaxy A5 2017 for a while now (which has a type-C port) and enjoying the new standard, I decided to grab a USB 3.1 type-C SSD enclosure from Sharkoon to use as fast portable storage.

I put my Kingston HyperX Fury 240GB SSD in the enclosure and did a few speed tests over the range of USB standards:

Naturally I did not bother going below USB 2.0, since nobody in their right mind would use USB 1.1 for storage purposes.

Here we can see that we are bottlenecked even at USB 3.0 5 Gigabit/s, at 420 MB/s. This makes sense, since the SSD’s SATA interface is rated at 6 Gigabit/s – it’s only once we get to USB 3.1 10 Gbit/s that we can realise the full potential of the HyperX SSD (which is rated at 500 MB/s). USB 2.0 is left in the dust at this point.
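
If you fancy reproducing a rough sequential read test on Linux, this is conceptually all it takes (a sketch – /dev/sdb is an assumption for where the enclosure shows up, so check with lsblk first; reading the raw device is harmless, but always double-check device letters around dd):

# find where the enclosure landed
lsblk -o NAME,SIZE,MODEL

# drop the page cache so we measure the drive, not RAM
sync && echo 3 | sudo tee /proc/sys/vm/drop_caches

# read 4GB from the raw device (read-only) - dd reports throughput at the end
sudo dd if=/dev/sdb of=/dev/null bs=1M count=4096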

So we can now utilise an extra ~80 MB/s out of an SSD. Whoop-de-doo. My question is – how much does this really affect things in the real world compared to USB 3.0 5 Gigabit/s? I have an install of Ubuntu Linux on the SSD that I will use to measure the difference in boot times. I can exclude BIOS POST time, since I can reliably start the clock from the GRUB boot menu (see the sketch after these timings):

  • USB 3.1 10 Gigabit/s: 16.40s
  • USB 3.0 5 Gigabit/s: 16.51s
  • USB 2.0 480 Megabit/s: 28.69s
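
As a cross-check on stopwatch timings like these – and assuming a systemd-based Ubuntu install, which yours may or may not be – systemd-analyze will break the boot down for you (it measures from kernel handoff rather than the GRUB menu, so the numbers won’t match mine exactly):

# total boot time, split into kernel and userspace
systemd-analyze time

# the slowest services, if you want to see where the time went
systemd-analyze blame | head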

Well there you go – you won’t see much difference going from 5 gigabits to 10 gigabits if you’re running one SSD in an external enclosure. It’s apparent, however, that USB 2.0 is a severe bottleneck in this case – so moving to at least USB 3.0 is definitely worth it. Beyond that – not really.

The main selling point of that Sharkoon enclosure for me is the type-C interface. Before, I was using USB micro-B 3.0 – a truly awful connector that never should have existed in my opinion. Type-C is right now the best that you can get, and I look forward to it becoming ubiquitous in the future.

09 Mar 2017

The Samsung Galaxy A5 2017 – Introduction and overview

Well, my venerable ASUS ZenFone 2 died on me. The failure mode was quite interesting: the battery was reading about 5% when I plugged it in one day, and the charge immediately shot up to 50% and stayed there. I left it charging for a while, but it never budged. I used it as normal and eventually it shut off. No “Low Battery” or “Android is shutting down” messages – just straight to black.

RIP 🙁

So I plugged it back in and left it to charge. I have a USB current meter that proved useful in determining the charge rate, and I knew it was charged when the current dropped to zero (there is no charge LED). All attempts at powering the phone up were fruitless. I can get it to vibrate once, but the screen won’t come on. The phone then sits for a bit, vibrates again, and after about two minutes vibrates once more – like it’s stuck in some weird boot loop. I have disassembled the phone and disconnected and reconnected the battery, but no dice. The phone gets fairly warm whilst it’s doing this boot loop stuff – so I reckon there is a glimmer of life.

I could also get it to show up in Linux as an Intel device connected over USB, but it continuously disconnects and reconnects. It does not mount any storage over USB, so I have lost all the stuff that was on the phone (which is okay, because the lot of it is on Google Drive).
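
For anyone who wants to watch the same disconnect/reconnect cycle on their own bricked device, standard Linux tooling shows it happening (nothing phone-specific here, just how I observed it):

# follow kernel messages as the device appears and drops
sudo dmesg -w | grep -i usb

# or watch the udev events themselves in real time
udevadm monitor --udev --subsystem-match=usb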

Anyway, if other people have any ideas/fixes I am keen to hear them!

Obviously, being a 23-year-old, I cannot function without a smartphone. Sad, I know, but there it is. I had to switch back to my Sony Xperia U, which was a painful process. Going from a 5.5-inch screen, quad-core Intel chip and 4GB of RAM to a 3.5-inch screen, dual-core ARM and 512MB of RAM is not a pleasant experience.

I thought this used to be a big phone back in the day

I was on Pay-As-You-Go with Three for a while, and I decided it was time to just get a contract phone. I decided to go with the recently released Samsung Galaxy A5 2017 since the online reviews were strongly positive. And I gotta say – it’s entirely deserved.

This is a seriously slick phone. Very glossy, which is very nice to look at – but it picks up fingerprints very easily and can be a bit slippery to hold. Also, I would advise against putting it in a pocket with your keys (I am not sure of the scratch resistance).

Above you can see how glossy the device is (sorry for the grainy photos – I was using my tablet in low light).

The screen on the phone is an AMOLED display. It is, in a word, stunning. I don’t think I have ever seen another screen quite like this. It is 1080p with a pixel density of 424ppi – plenty for my needs. It’s sharp, and with the infinite contrast ratio I get fooled into thinking the screen is off between scenes in films and TV shows. Each pixel provides its own light, which eliminates the need for a backlight. This leads to some incredible battery life, and a neat always-on feature where the screen displays just the clock and your notifications without killing the battery.

So it’s getting into the evening – I unplugged the phone at about 8am and did not plug it in at all today. I used it as a GPS twice (to work and back), whilst also using it occasionally for emails. Even with a heavy day, I have not yet managed to completely drain the battery. The supplied charger is a 9V fast charger too – it can take the phone from 10% to 90% in about 45 minutes. Awesome stuff!

As for the camera – well you can be the judge:

The rear and front cameras are both 16MP and can both record at 1080p (no 4K or slo-mo, unfortunately). The great camera quality and screen really do make the images look awesome. Also pictured are some parts for my upcoming Ryzen build… sooooooooooon

This is just a short(ish) look into the Samsung Galaxy A5 2017 – so far I am very pleased with the device. I will be writing more about the performance and software in the upcoming weeks!

10 Feb 2017

A video editing review: Shotcut

Well, if you saw my post yesterday, then you’ll know that I’ve switched to 21:9 goodness. This poses a bit of an issue for me, since I record some funny gameplay moments using Nvidia ShadowPlay and then chop them up using Windows Movie Maker.

Already I can hear the hail of palms hitting faces, but there are a couple of reasons why this worked for me. I’m only chopping up video, so I don’t really need anything complicated in terms of video editing software. Let’s be clear – I’m not doing things like animations or titles (maybe the occasional text overlay). Movie Maker was just fine…

…until I realised that its maximum output resolution is 1920×1080. Boo. I’m now recording at 2560×1080, and downscaling to 1920×806 is just… nasty. I need something free and easy that supports the higher resolutions. I stumbled across Shotcut.

First off the bat – the layout works great at 21:9! The timeline at the bottom is nice and long, and the panels fit nicely at the sides with the main video feed in the middle. All of these panels can be moved about to your heart’s content. The editing experience on my PC is just fine, and it only took me a couple of minutes to figure out how to start editing down my videos.

It looks like you can output at 4K using the built-in presets, but you may be able to go over that… I haven’t tried yet, but I might in the future. The Export tab is used to set up an encode, and the jobs get queued in the job list on the left-hand side. Output speed isn’t as fast as Movie Maker, but I’m not too bothered about waiting a tad longer. You can queue a whole plethora of jobs with “Pause” engaged, then unpause the entire queue once you’re done editing. Let it churn, and you can come back to a finished batch of videos!

So far so good for what I need to use the software for. I stumbled across something in the Settings menu however…

Wait, wait, wait. You’re telling me that this free, open-source NLE has GPU acceleration?! That is neat! I currently have a GTX 960 4GB video card, and it seems to be picked up and utilised just fine. Even the video exports are GPU accelerated. I decided to benchmark using a 3-minute ShadowPlay recording:

  • GPU acceleration disabled: 14m 24s
  • GPU acceleration enabled: 9m 8s

That works out to roughly a 37% saving in export time. A lot of the video filters you can apply are GPU accelerated too (like colour correction) – I was able to push GPU usage to 75% with totally smooth playback (though by that point the video looked awful!)

Shotcut is available for Windows, Mac and Linux – though I have had the Windows version crash on me a couple of times. The GPU acceleration mode is experimental after all. I’ll do some classic Windows vs Linux for video editing in the future – now that a decent platform exists for both!

So, to round up – this is a good review from me! I have relatively simple needs when it comes to video editing, but Shotcut certainly meets them for me. Maybe I’ll try doing some more advanced stuff with it in the future – I’ll be sure to let you all know how it goes!

09 Feb 2017

Ultrawide 21:9 on Linux – LG 29UM68-P Review

Well well it has been a long time! I should really be writing more on this blog… Anyway! On with the show…

I’ve become rather… fed up with my old 1440×900 Samsung monitor. In fact, it was the Samsung 953BW. I liked it for a number of reasons:

  • It had a 75Hz refresh rate – nice for gaming
  • The 2ms response time was very good – it made for a decent gaming monitor
  • It was widescreen… okay yes this isn’t that big a deal, but the screen I had before that one was 1280×1024, so it was an upgrade!
  • It had VGA and DVI in – and I could even run it from an HDMI-to-DVI cable
  • It was cheap. £50 off eBay. No worries.

That was a few years ago now. I have freed myself from the constraints of small PC monitors and resolutions…

…I got myself an LG 29UM68-P. Yah, badass.

This thing is nearly three feet wide. I had some concerns about the size – but it turns out that this fits perfectly on my new desk. There are some things about this that I immediately like:

  • It uses an IPS panel – the colours are so much better (>99% sRGB)
  • The backlight is actually even (a problem on my old monitor)
  • It has one DisplayPort input and two HDMI inputs. This is great for me – I have a few Raspberry Pis kicking about and…
  • …I can use Picture-In-Picture (PIP) to have my PC on one side and the Pi on the other, with 1280×1080 resolution each.
  • It has FreeSync support – something I will be using in the future when I upgrade my GPU.
  • The menu is easy to use and quick to get at all the settings for things like Game Mode.
  • The 5ms response time is pretty good, though I can tell it is a tad slower than the old 2ms – not enough to put my somewhat mediocre gaming skills to the test, mind.
  • So much room for activities!

Now I know what you’re all thinking… maan, this is only 2560×1080! 3440×1440 is where it’s at for ultrawide 21:9 ratios!

To that I say: nay nay! For I have some reasons:

  • At 2560×1080 on this size of panel, the pixel density is ~96 DPI. More on this below…
  • I can actually run games on it with a modest GPU
  • I don’t have a 4K-capable Blu-ray player – so as far as media consumption goes, the 1080p vertical resolution is spot on.

So, to elaborate on the 96 DPI – this lets me run Windows and Linux with no scaling. Though it is getting better, Windows DPI scaling has many issues. On the Linux side of things, I can only find scaling levels that are whole integers (1x, 2x, 3x…) – this is a bit of a pain, since 1x on a high-res monitor is too small and 2x is often too large. Forget about 3x unless you’re running, like, 8K or something…
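
For the curious, the ~96 DPI figure falls straight out of the resolution and the 29-inch diagonal – a quick back-of-envelope:

# pixels along the diagonal, divided by inches of diagonal
awk 'BEGIN { w=2560; h=1080; d=29; printf "%.0f DPI\n", sqrt(w*w + h*h) / d }'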

More about Linux – I am running Linux Mint 17.3 “Rosa”, based on Ubuntu 14.04 LTS. I had my concerns, since I am running a relatively old version of Linux, but I was surprised when I booted the PC for the first time that everything worked immediately. I mean, not even any X config file editing, no going into the Nvidia control panel – it just worked. And no scaling required!

Gaming was a great experience – though one or two games flat out didn’t recognise the wide aspect ratio, the ones that did work were simply glorious. I mean, check out Civ: Beyond Earth!

Looking good I think!

I am using DisplayPort to run the monitor, though to get 75Hz I had to “overclock” it using the Nvidia Control Panel in Windows. It’s unfortunate that the 75Hz option isn’t simply available like any other resolution. I don’t really game much on Linux, so I have left it at 60Hz there – maybe I should figure out how to get it to 75… if I manage it, I will post how I did it!
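
If I do get round to it, the usual Linux recipe is a custom mode via xrandr – an untested sketch (the DP-0 output name is an assumption, so check yours with xrandr -q, and use the modeline that cvt actually prints rather than my illustrative numbers):

# generate a 2560x1080 @ 75Hz modeline
cvt 2560 1080 75

# register it and switch to it (modeline values here are illustrative)
xrandr --newmode "2560x1080_75.00"  294.25  2560 2744 3016 3472  1080 1083 1093 1135 -hsync +vsync
xrandr --addmode DP-0 "2560x1080_75.00"
xrandr --output DP-0 --mode "2560x1080_75.00"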

I do have a couple of nit-picks about the monitor however:

  • Glossy black is nice and all, but a bit fingerprinty IMHO.
  • The stand is a bit basic. You only get tilt adjustment, and that’s it. It has a VESA mount, though, so you can put a better stand on.
  • The menu is controlled by a joystick. It is also the power button. Why LG, why?
  • You don’t get a DisplayPort cable in the box. Seriously. You get a bundled HDMI cable instead.
  • The power brick is external, with one of those laptop-style mains cords.

That’s about all I can think of for this monitor. It’s simply a well-priced, great-performing panel for the average gamer and user. I love being able to actually use complex software like video editing packages with plenty of space to manage work. It’s not going to be fast enough for a professional CS:GO gamer who needs sub-1ms response times, but that’s not what I got this panel for. I got it for everyday use, watching films at glorious 21:9 cinematic scale and playing games. It meets those expectations extremely well – I give it a solid 9/10. LG, good show!

04 Jun 2016

2016 NAS: Power Consumption

So, the NAS is complete. I purposely bought low-power hardware – the 35W i3 chip, 5900 RPM NAS drives, 80+ Bronze PSU, etc. It’s time to have a look at what we accomplished.

I got an el-cheapo power meter from Maplin. It claims 2.5% accuracy, which is fine for basic testing. I’m not about to fork out hundreds to see the difference of a watt or two!

So, I booted the NAS, and left it idle for about an hour and looked at the reading:

37W at idle – not bad at all! At this usage, it will use ~320 kWh per year – so you can calculate the running cost based on your electricity price. At the average UK electricity price, that’s about £34 a year – or £2.84 a month (assuming a fixed tariff).
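
Here’s the maths, if you want to plug in your own tariff (the 10.5p/kWh rate is my rough UK average assumption):

# annual energy and cost from a constant draw
awk -v watts=37 -v gbp_per_kwh=0.105 'BEGIN {
    kwh = watts * 24 * 365 / 1000
    printf "%.0f kWh/year, £%.2f/year, £%.2f/month\n", kwh, kwh * gbp_per_kwh, kwh * gbp_per_kwh / 12
}'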

I used the RockStor GUI to spin the drives down to see what the base system used:

The difference shows the hard disks were using 11W – just under 4W per disk for the Seagate 4TB NAS drives. Not bad at all!

I then loaded up a Plex transcode (a sync to my phone that would take a while), and this is what I saw:

That is 65W. This is the highest value I observed, and is very good for a NAS that can churn out multiple 1080p video streams to several devices at once.

So that’s the hardware side of things out of the way! I should really get a UPS for this system since I plan on storing a lot of data on it, and I will probably grab a fourth drive to get the best value out of the storage space I have. Up next I will be talking about the OS I picked to use on this NAS: RockStor!

03 Jun 2016

2016 NAS: Seagate 4TB NAS Drives

So, now onto the most important part of this build. The hard disks. I had a conundrum: do I get the WD 4TB Red drives or do I go for the Seagate NAS drives?

I scoured Reviews-land, searching far and wide for a justifiable reason to go for the WD Reds over the Seagate NAS drives. The price difference was about £30 per drive, so with three drives that is a fair bit of margin. My past experience also pointed to Seagate drives for RAID – the one WD drive I’ve owned has corrupted photos in the past. I will be using BTRFS anyway, so I won’t have to worry so much about bit-rot.

Aaaaaaaaaaand the Seagate drives went on sale. So – here we are:

My, are these drives absolute beasts. They’re the heaviest hard disks I’ve ever had. And I’ve had a lot of hard disks.

Time for some performance numbers, eh? Let’s do some classic hdparm -tT /dev/sdx commands and see what we get:

hdparm -tT /dev/sd[abcd]

/dev/sda:
Timing cached reads: 14610 MB in 2.00 seconds = 7311.68 MB/sec
Timing buffered disk reads: 472 MB in 3.00 seconds = 157.30 MB/sec

/dev/sdb:
Timing cached reads: 14408 MB in 2.00 seconds = 7210.30 MB/sec
Timing buffered disk reads: 488 MB in 3.00 seconds = 162.54 MB/sec

/dev/sdc:
Timing cached reads: 19482 MB in 2.00 seconds = 9753.21 MB/sec
Timing buffered disk reads: 498 MB in 3.01 seconds = 165.59 MB/sec

/dev/sdd:
Timing cached reads: 15366 MB in 2.00 seconds = 7690.23 MB/sec
Timing buffered disk reads: 654 MB in 3.00 seconds = 217.95 MB/sec

So, sda, sdb and sdc are the three hard disks; sdd is the boot SSD (plugged into a SATA-2 port). We are hitting about ~160 MB/s on the hard disks – these are also the fastest drives, in terms of sequential read speed, that I’ve ever had.

These Seagate drives run at 5900 RPM, which is slower than the standard 7200 RPM you see on most consumer drives, and way less than the 10k or 15k RPM you see on server drives. The lower spindle speed mainly impacts access time, since the drive has to wait longer for the data to reach the read head. To test for access time, I like to use seeker.

Here are the results:

./seeker /dev/sda

Results: 79 seeks/second, 12.64 ms random access time

./seeker /dev/sdb

Results: 76 seeks/second, 13.01 ms random access time

./seeker /dev/sdc

Results: 78 seeks/second, 12.72 ms random access time

./seeker /dev/sdd

Results: 5520 seeks/second, 0.18 ms random access time

So that’s an average of ~78 seeks/second, or 12.79 ms random access time. In comparison, the SSD is obviously laughing, and my consumer Seagate 500GB HDDs get 64 seeks/second (15.6 ms). So these lower-RPM NAS drives are doing better on access time than the 7200 RPM consumer drives. Nice.

Okay, I know what you are saying. What about real-world performance? How fast is BTRFS RAID-5? Well, I can dd a file and get 250MB/s plus. That’s plenty for gigabit ethernet. What about random access? Well, I can test that using sysbench! I detail the settings I’ve used in this post. The results are below:

3x 4TB Seagate NAS BTRFS RAID-5: 394.62Kb/sec

3x 500GB Seagate Barracuda 7200.12 mdadm RAID-5: 762.55Kb/sec

NAS SSD Kingston SSDNow 300 120GB: 34.728Mb/sec

So yeah, in real-world use with BTRFS the NAS drives are not as fast as an mdadm array. I believe the checksumming that BTRFS performs could be playing a part in the access times. I would note that this is not representative of what I will be asking of the NAS – it will mainly be serving Plex movies (large files), and it has more than enough performance for that task! If I need fast space on the NAS, I have the SSD there anyway!
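
For reference, a sysbench file I/O run of that era looked roughly like this – a sketch using the old 0.4/0.5-style flags, not necessarily the exact settings from the linked post:

# create a set of test files (ideally bigger than RAM)
sysbench --test=fileio --file-total-size=8G prepare

# random read/write for 5 minutes; reports Kb/sec at the end
sysbench --test=fileio --file-total-size=8G --file-test-mode=rndrw \
         --max-time=300 --max-requests=0 run

# tidy up the test files afterwards
sysbench --test=fileio --file-total-size=8G cleanup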

In terms of acoustics, the drives are fairly quiet. Not silent, but you can’t really hear them seeking very much. It is quite cool hearing them spin up, since they take a fair bit of time to reach full speed. In terms of temperatures, they don’t get warm at all. 32-33 degC is the highest I’ve seen from the middle drive in the stack, and they don’t seem to fluctuate that much.

All in all, these are awesome drives for the money and space that you get. I have 8TB of usable storage space that I know will take me a good while to fill! Tomorrow’s post will talk more about power consumption of the drives, so stay tuned for that! Ta-Ta for now!

02 Jun 2016

2016 NAS: The Logic Case SC-N400 (Norco ITX S4)

Next, it is time to look at what ties the 2016 NAS together. The case.

This is the Logic Case SC-N400. It’s a clone of the Norco ITX-S4. As far as I can tell, the two cases are identical. Anyway! On with the tour!

Above, I have opened the door (which can be locked) and you can immediately see the four hot-swap 3.5″ HDD bays. These are fantastic for quickly swapping out a hard disk if one suddenly dies on you, and they are very convenient when you want to add, swap or remove disks. Each bay has a power LED (blue) and an access LED (green).

The bottom portion of the front has the status LEDs, power and reset buttons, and two USB 2.0 ports. The USB ports are perfect for installing the OS (not that I used them – see the review of the motherboard here). There are three network activity LEDs, just in case you need as many blinky lights as possible! +1 for blinky lights! The LEDs are very bright – keep that in mind if you plan on keeping this in a bedroom.

Here, you can see how the drives are pulled out. You unclip the bay by pushing in the blue tab, and then you can pull the bay out using the arm that springs out. Pushing the bay back in is also very easy – just push the bay back as far as it will go (reasonably gently, mind) and push the arm back into the bay until the blue tab clicks into place. They are a tight fit, so go firmly but gently.

You can also see one of the 4TB Seagate NAS drives. They fit just fine and you can see ventilation holes on the top of the drive bays to keep the drives cool. They seem to stay at about 30-35 degrees C – perfect for hard disks.

Above you can see where the drives connect to the system and the main cooling fan. There are four normal SATA connections and a single Molex power connector. There is not much room back here, so organising four SATA cables, a Molex and a fan cable is rather tricky! When it comes to cooling, the rear fan draws air from the front of the case, through the hard disks, and out the back. Some ventilation is also given for the rest of the system below the hard disks.

The case came with a rear case fan, but it was a 3 pin fan that ran at 5000 RPM. It moved a lot of air, but was very noisy. I have swapped it for an Arctic F8 PWM Rev. 2 80mm fan that I can control with the motherboard. That seems to be plenty for this system, and the fan can ramp up to full speed if things get warm.

This is the side view of the NAS. You can see the four SATA connections for the main hard disks, but we also have a spot for a 2.5″ boot drive! How thoughtful! Since this drive is just for loading the NAS OS (in this case, RockStor), an SSD is the more reliable choice, since it will see very little usage (unless you share its storage). I have put a 120GB Kingston SSD in since taking the above picture, as that 30GB Corsair drive is starting to show its age.

You can see the mainboard at the bottom – and you can see in the next picture how little room there is for the CPU cooler:

That low-profile cooler has under a centimetre of clearance left. The stock Intel cooler does not fit. There are ventilation holes above the cooler so it can draw in cool air from above – and I don’t see the CPU going above 60 degC. This is also a reason I went for the low-TDP i3: I knew there would not be much room for heat to build up in that space, so reducing the power output as much as possible was necessary to keep the system cool.

A power supply came with the case. Yeah. I swapped it out for an 80+ FSP unit. The case accepts “Flex-ATX” or “Flex” power supplies. They are relatively common online if you have a look about.

Much better! Also, you can see the new SSD in this photo. Rewiring this thing with the new power supply was not fun, since the 4-pin CPU power connector is under the hard drive bays.

So, let’s do a conclusion. This is a small NAS case with a lot of functionality to boot! It has the hot-swap bays, status LEDs, sufficient cooling for low-power hardware, and room for four 3.5″ drives with a 2.5″ drive on the inside. With the 8TB disks you can get at the time of writing, you could create a 32TB storage box with this thing – in something not much bigger than a couple of shoeboxes.

If I had to criticise the case, it would be that there is pretty much no way to cable manage this thing tidily (then again it is a small case) and that the cover is quite tricky to get back on. Apart from that, it is perfect for a little home NAS.

I give this a rating of 9.5/10, and if four bays isn’t enough for you – check out the 8-bay version from Norco!

14 Apr 2016

2016 NAS: Intel Core i3 4360T

So, in this ongoing series I am reviewing the parts of my newly constructed NAS! I have introduced the whole system and the rockin’ ASRock server board I am using – but now it’s time to look at the brains of the system:


That is the Intel Core i3-4360T. I searched around for reviews of this chip, but could only find some for the full 4360 (non-T). This is a 35W TDP part, and there are some good reasons I went for it:

  • Low power consumption under load
  • Low heat output

The case I am using (a clone of the Norco ITX-S4) requires a low-profile CPU cooler, and the stock Intel cooler does not fit. Nonetheless, I did get a stock cooler, just in case it happened to fit:

That is the lowest profile stock cooler Intel did, and it is too tall. Instead, I went with the Akasa K25 low profile cooler:

With the fan speed turned down to minimum (~1000 RPM and barely audible), the CPU just about reaches 60 degrees Celsius. That is perfectly fine for me, as the system is going to spend 99% of its time chilling at idle and serving files.

Right, that’s the physical aspects out of the way! Now, it is time to talk about performance!

Geekbench 3 (link):

  • Single core: 2816
  • Multi core: 6080

Comparison: my AMD FX-6300 gets 2032 and 7976 respectively. Consider this: this CPU gets ~75% of the multi-core performance of a six-core AMD chip for a similar price, but at a much, much lower TDP (35W vs 95W).

Considering the low TDP, we can look on Passmark to see the performance-per-watt:

  • i3 4360T Passmark score: 4599 points. 4599/35W = 131.4 points/watt
  • FX6300 Passmark score: 6343 points. 6343/95W = 66.8 points/watt
  • Celeron N2830 Passmark: 993 points. 993/7.5 = 132.4 points/watt
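
The sums, if you want to rank your own chips the same way (bearing in mind TDP is a spec-sheet figure, not measured draw):

# PassMark points per watt of TDP
awk 'BEGIN {
    printf "i3-4360T: %.1f pts/W\n", 4599 / 35
    printf "FX-6300:  %.1f pts/W\n", 6343 / 95
    printf "N2830:    %.1f pts/W\n", 993 / 7.5
}'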

Wow. This thing very nearly matches the power efficiency of my Intel NUC! That is saying a lot, since the NUC’s Celeron is a 7.5W part.

As for real-world performance, it is total overkill if you are just using this for file serving. This chip is perfect for a home Plex server. I have used it for on-the-fly transcoding to four devices at once, and it made it look easy. Plex’s rough guideline is 2000 PassMark points per 1080p stream, and this thing pushes 4600. Plenty for my needs. And if I do (somehow) need more, I can just upgrade to an i5/i7!

So, overall, this thing is a beast. It has hardware encryption support, virtualisation, and all the modern extensions (AVX2 etc.) that make video encoding faster. I give this a 10/10 and cannot fault it for the performance it gives. Stay tuned for the next review: the case!

02 Mar 2016

Quick Review: The Corsair K70 Mechanical Keyboard

Ooooh! Nice! I guess it’s time for a quick review of the Corsair K70 mechanical keyboard (non-RGB version). The particular model I have features Cherry MX Red keyswitches, which do not have a tactile “bump”, but rather a continuous linear movement that is meant to be “better for gaming”. Personally, I really like these keyswitches! They are rather satisfying to type on, but they make quite a racket when you are typing at full speed.

Some more glamour shots:

You can choose between three levels of backlight and an off setting. You can disable the Windows key (no more accidentally opening the Start menu mid-game) with a toggle switch near the indicator LEDs, and also enable a custom lighting mode:

If you hold down the custom lighting button for three seconds you can enable and disable the lighting for individual keys – which might be nice for some games/applications that use a wide range of keys.

They also throw in some interchangeable keycaps and a key removal tool:

You also get keycaps for the 1-6 keys, which are applicable to many first-person shooters. I am also a huge fan of the media section of the keyboard:

So we have the usual suspects, but the volume wheel is extremely nice. It is perfect for quickly setting the volume just right in-game – much easier than individual buttons. It also feels very premium!

Finally, a shot of the back side of the keyboard:

Here we can see some of the mechanical keyswitches, but also a response selector switch and a USB port. The selector switch allows you to adjust the response time from 1 to 8 milliseconds, depending on how you like it. BIOS mode is there for legacy BIOSes if you need it, but I just leave mine in the 1ms position and I’ve had zero problems. The USB passthrough port is also very useful for pen drives or other peripherals (such as a mouse with a short cable, or a wireless receiver that you can position close to the mouse instead of in the tower under your desk).

So, I had better round up this review! The build quality of this keyboard is awesome – it does not flex anywhere, not even a smidgen. The brushed aluminium is very nice, but seems prone to fingerprints, and the included wrist-rest is a nice addition, since the bottom edge isn’t very big. It works very well in Linux as well – no weird drivers to download (nor in Windows; the Corsair RGB software does not work with this keyboard).

All in all, I give this keyboard a solid 10/10. An absolute beast that will definitely last me many years to come!