Posted in Coding

Best Blogging Platform for Coding

I’m looking to start blogging again, mostly to keep track of code snippets from things I’ve been learning.

I’ve been thinking that I’d really like a blogging platform with syntax highlighting and Markdown support, but one that would also let me post plain-text paragraphs. I know there’s GitHub and CodePen, and of course WordPress. I’ve even been thinking of using DokuWiki, as I have in the past, to keep my coding notes and code snippets.

I don’t know what I’ll end up using; I guess I’ll stick with GitHub and CodePen for now until I have time to look into something else.

Posted in Computers, Hardware, Server

New Server – Lenovo TS140

It’s been 5 years since I purchased the HP N40L, so it was time to upgrade to a newer, faster system. The N40L is a nice unit in a small package – although a bit slow for my needs.

Enter the Lenovo TS140 70A4003AUX. I found a deal on Newegg that I couldn’t refuse, so I immediately added it to the cart and checked out before changing my mind.

There are a few models in the TS140 series. The one I chose has the following specs:

  • Intel Xeon E3-1226 v3 3.3GHz
  • 4GB 1600MHz RAM
  • Intel
  • No HDD
  • No OS

For the full spec list, search Google for Lenovo TS140 70A4003AUX.


I purchased this server to replace my N40L, which was serving as my ESXi host. While the N40L did a great job, the 8GB RAM limit and slow CPU really limited this server to running a few VMs with light loads.

I installed ESXi 6.5 on the TS140, and plan to use it as my main (and only) ESXi host for my various testing VMs, my firewall/router, and as a testing/development environment.

Inside the Case

Immediately upon receiving it, I opened up the case to take a look inside. The cables are tidy, and everything looks very clean.


I was mostly interested in an internal USB port for ESXi, and was disappointed to learn that this server does not have one. No problem, I’ll just use an external one.

There are 2 trays for 3.5″ hard drives, which can also hold 2.5″ drives like SSDs with the addition of an SSD bracket. That’s what I did: I installed a spare 64GB SSD that I had. I’ll probably upgrade to a larger SSD if necessary, but I think this should suffice for now.

The server also has an optical drive bay, which can be removed and replaced with a 2.5″ SSD or HDD bay. I may go this route if I need the extra storage, but there’s no need for now, since all my storage is on a LUN on my Synology DS1512+.


There is only one network port, and it’s shared with Intel AMT for remote access and management of the server. I installed a 4-port Intel i350-T4 NIC that I had purchased for the N40L, so that takes care of the networking.


There’s not much that can be done with only 4GB of RAM these days, especially if the server is going to be used as a virtualization host. This server uses DDR3 ECC RAM. Since I was replacing the N40L, I removed its 8GB of RAM and installed it in the TS140, which is now running 12GB across 3 slots.

This is not the best solution, however. The server maxes out at 32GB RAM and it has 4 slots total, so I’ll be replacing the RAM with some 8GB chips in the near future.

Power Supply

The power supply uses a proprietary connector instead of the regular ATX connectors we commonly see on PSUs and motherboards. This worries me a bit, but I knew it in advance, so I’m going to live with it. The power supply is a fixed 285-watt Bronze-rated unit – enough for my needs.


The server came with plenty of low-noise fans – all covered with grills. The system is pretty quiet and stays cool.

Power Consumption

This is another big reason why I wanted to upgrade the older server, and why I refused to use my old PC as a server. The TS140 sips power, sitting at around 25 watts at idle after ESXi has loaded. I never saw my power meter go higher than 50 watts. Keep in mind that this is without mechanical drives and with only 1 SSD – I’m sure the consumption will be higher after adding a few mechanical drives.
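That idle figure works out to surprisingly little energy over a year. A quick back-of-the-envelope sketch (using my 25-watt idle reading; your numbers will vary):

```shell
# Rough yearly energy use for a box idling at 25 W around the clock
WATTS=25
KWH_PER_YEAR=$(( WATTS * 24 * 365 / 1000 ))   # watt-hours per year, converted to kWh
echo "${KWH_PER_YEAR} kWh/year"               # prints "219 kWh/year"
```

Multiply by your electricity rate to get the yearly cost of leaving it on 24/7.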


I’m very happy with the Lenovo TS140 so far. It is a huge upgrade over my previous server, and it serves my needs very well for now.

Having a server-grade CPU and a motherboard with out-of-band remote management makes this server well worth it.

Posted in Music

Apple Lossless vs. FLAC

Below is a very old post that I saved to my drafts folder over 6 years ago, according to the date on “Draft Saved.” I’m posting it just for reference, as some things have changed: I no longer use a Mac 100% of the time, and I no longer consider Apple Lossless even an option.

After I wrote the draft below, I went ahead and ripped my whole collection to FLAC using EAC and dBpoweramp. All the music files have been sitting happily on my NAS for the past few years – a bit lonely, I’ll admit. I’ve been using Spotify and Pandora quite a bit for some time now and haven’t listened to my FLAC collection much.

Original draft date: Sometime in 2010

Just as I thought that I had already made a decision regarding which format to choose to rip my music collection, I find myself confused, looking at pros and cons, and considering starting all over again.

There are mixed opinions when it comes to choosing a lossless codec. A vast majority vote for FLAC due to its wide support in both software and hardware, and for being open source.

Apple Lossless, on the other hand, is tightly integrated with the iPod, iPhone, iTunes, and the rest of Apple’s software and hardware. It makes the most sense if you are strictly using Apple products (which I am, to some extent). Being a Mac user, I had originally chosen to go with Apple Lossless – there was no doubt about it.

But now that I’ve ventured into open-source software and have been using Linux, I see Apple Lossless as more of an obstacle and less of a “one-size-fits-all” decision.

To play Apple Lossless, I’ll need iTunes at the very least. There are other programs out there that can play it, but iTunes is the best known.

FLAC, on the other hand, has software available for Windows, Mac, and Linux, and there is a wide range of hardware that will play FLAC files without any special effort.

Posted in All Other Stuff


It’s been a while (3+ years) since I’ve posted anything on this blog, but this doesn’t mean that I’ve stopped breathing or that I’m living now under a rock. I’ve been quite busy over the past 3+ years, learning a ton, and experimenting with and trying new things.

I’ll keep this post short, merely to say that I plan to pick up where I left off and continue adding content. I was considering starting a new blog from scratch, but I think I have some valuable information here that I’d like to keep and reference in the future. I’ve also noticed quite a few hits on some pages, suggesting that some of the information I’ve posted has been useful to visitors.

For now, this is my first post back, and I’m going to aim to post something useful at least 2-3 times a week – hopefully daily as I get used to writing again.

Posted in Computers, Fedora, Linux

VMware Tools Cannot Find Kernel-Headers on Fedora 18 x64

I recently installed Fedora 18 x64 on VMware Workstation 9 and was initially unable to complete the VMware Tools installation using the same methods I’d used many, many times with prior Fedora installations. The installer kept telling me that it couldn’t find the kernel-headers folder. I had installed the development tools with my Fedora install, and they were all up to date, so I was a bit puzzled.

Before installing VMware Tools, you need to install the Fedora development tools if you don’t have them (if you’re unsure, just check); otherwise, the installer will complain that it cannot find something and will ask you to provide a path.

The development tools needed are: gcc, make, binutils, kernel-devel, kernel-headers

I also recommend updating the existing kernel to match the versions from kernel-devel and kernel-headers.

  1. Update your kernel, restart the vm after the installation: # yum update kernel
  2. Install the development tools, restart the vm when finished: # yum install gcc make binutils kernel-devel kernel-headers
  3. Run the installer script, accepting all the defaults (unless you know what you’re doing and want or need to change something)
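Since the headers must match the running kernel, it’s worth confirming that the two versions line up before running the installer. Here’s a quick sketch of that check, assuming Fedora’s standard package naming:

```shell
# Compare the running kernel with the newest installed kernel-devel package
RUNNING=$(uname -r)
DEVEL=$(rpm -q --qf '%{VERSION}-%{RELEASE}.%{ARCH}\n' kernel-devel | tail -n 1)
if [ "$RUNNING" = "$DEVEL" ]; then
    echo "kernel and kernel-devel match: $RUNNING"
else
    echo "mismatch: running $RUNNING, kernel-devel is $DEVEL - update and reboot first"
fi
```

If they don’t match, update the kernel and reboot before continuing.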

If the script complains that it cannot find the location of the kernel headers – and you’ve verified that they are installed with # rpm -qa – then you must copy the kernel headers from one location to another. Find out which kernel you’re using with # uname -a. The current kernel on my system as of 01-19-2013 is 3.7.2-201.fc18.x86_64

Run the following command to copy version.h to the location where the installer looks for the header files:

# cp /usr/src/kernels/3.7.2-201.fc18.x86_64/include/generated/uapi/linux/version.h /lib/modules/3.7.2-201.fc18.x86_64/build/include/linux/

If the installer is still running, you can type that path in when it asks for the header files’ location. If it’s not running, run it again, and it should find the path automatically.
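To avoid typing the long version string twice (and to keep the command valid after kernel updates), the same copy can be scripted with `uname -r` – just a sketch of the command above, not a different fix:

```shell
# Copy version.h into the build tree for whatever kernel is currently running
KVER=$(uname -r)
cp "/usr/src/kernels/${KVER}/include/generated/uapi/linux/version.h" \
   "/lib/modules/${KVER}/build/include/linux/"
```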

Thanks to user jgkirk from the VMware forums for this tip; the original post that helped me can be found there.

Posted in Electronics

Unable to Scan for Networks HTC Inspire 4G

For the past few months, I’ve been having wi-fi and mobile network issues with my HTC Inspire 4G, running Android 2.3.3.

Some of the problems: forgotten wi-fi networks, dropped connections that wouldn’t reconnect unless I turned wi-fi off and back on, and poor or no wi-fi signal, among others.

The problem got worse with time, to the point where the wi-fi wouldn’t even turn on. As soon as I attempted to turn it off and on again, I got “error”. Restarting the phone seemed to help sometimes, but now it doesn’t work at all.

I’ve also reset the phone to factory settings a few times. This seemed to help at first – the wi-fi started working again for a day or two – only to stop working altogether shortly after. In addition to restoring the phone to factory settings on the phone itself, I downloaded the software from the HTC site and did it that way too… it worked for a day, but now not at all. Now I get the message “unable to scan for networks”, and sometimes “error” too when attempting to turn it on. The Wi-Fi MAC address shows “Unavailable” when looking in Settings, About phone, Hardware information.

It doesn’t matter where I am; the wi-fi will not turn on at all. I have no idea what to do anymore, and I can’t use the phone on AT&T’s network all the time since I don’t have an unlimited data plan, so the phone is pretty much useless at this point. I rely on the phone’s wi-fi connection a lot, so having it not work is a big deal.

After doing some Google searching, I see other people having the same problem with different phones. Some were able to get the wi-fi working again after a reset; others are still disconnected like me. The fact that people are having this very same issue on different phones leads me to think it may be software-related, especially since the problem sometimes went away after a reset.

I’m not sure what’s wrong with my phone, but I’m thinking it could be a hardware issue… here’s my logic: if it were software, shouldn’t a phone reset correct it? And supposing the software on the phone itself was corrupted and the phone was restoring a corrupted copy, shouldn’t the problem fix itself after a factory reset using the HTC software? I could be wrong, since I haven’t found a definite answer to this issue.

I’ve been using an old iPhone 3GS in the meantime while I figure out what to do, and I just cannot get used to an iPhone again.

One thing I know for sure: I couldn’t go back to using an iPhone… SO limiting!

I’ve got to find a good AT&T smartphone to replace this one if I cannot find a fix.

Posted in Computers, Windows 8

Windows 8 Consumer Preview on SSD

I just installed Windows 8 Consumer Preview Build 8250 on my system after playing with it on a virtual machine for a while. While configuring my system and looking at different settings, I noticed that the defrag screen recognizes my SSD as a solid-state drive and tells me that it needs optimization.

I thought defragmenting a solid-state drive was a bad idea? Windows 7 disables defragging by default if it recognizes the drive as an SSD.

I’m not sure how different the defrag program in Windows 8 is from previous versions, so I’m a little concerned about letting it optimize my SSD.

The SSD is an OCZ Agility 3 60GB. See image below.

Has anyone else come across this, and if so, what have you done?

Posted in Computers, Hardware

HP P212 Smart Array Controller Performance Test on HP MicroServer N40L

Here are the results of a speed test after installing the HP P212 Smart Array controller in the HP MicroServer N40L. The OS is installed on the 250GB drive, which I moved to the ODD slot position; it’s connected to the motherboard via the internal SATA port. The BIOS was upgraded to a modded version to unlock the SATA ports from their IDE configuration.

Network speeds top out at 125MB/s when copying or moving large files (4GB) such as ISOs or MKVs. Smaller files such as pictures, MP3s, and documents copy more slowly from a networked PC running Windows 7 x64, connected through an 8-port TRENDnet gigabit switch: speeds can dip to between 30MB/s and 40MB/s, but are mostly in the 60-70MB/s range, sometimes higher – it all depends on the number of files and their sizes.
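That 125MB/s ceiling on large files is no coincidence – it’s exactly the line rate of gigabit Ethernet, so those copies are saturating the network rather than the array:

```shell
# Gigabit Ethernet moves 1000 megabits per second; 8 bits per byte
echo "$(( 1000 / 8 )) MB/s line rate"   # prints "125 MB/s line rate"
```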

The server is running Windows Server 2008 R2 Enterprise.

Additional specs:

  • 8GB ECC Kingston RAM
  • HP P212 Smart Array controller with 256MB cache and BBWC
  • 3 Samsung HD204UI disks in RAID 5 connected to the P212 controller, 512KB stripe size, cache split 25% read / 75% write, write-back enabled

The system is configured and running. It only has the file and printer sharing role installed, indexing for certain folders (paused for the tests), and CrashPlan Desktop+ (in sleep mode for the tests). The server was idle, showing no CPU or disk load during these tests.

The default settings for ATTO were used, only changing the drive to be tested.

Clicking on each image will open it up to its full size.

Test 1

Test 2

Test 3

Not sure why test 3 was the slowest, even after trying a few times. There was no load on the server at all.

If I’m not mistaken, and from what I’ve read online, adding an extra disk to the RAID 5 array should improve speeds. But so far this is pretty satisfactory for me, especially when using “green” disks that run at 5900 RPM.
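For reference, here’s the capacity math on the array (the HD204UI is a 2TB drive; in RAID 5, one disk’s worth of space goes to parity):

```shell
# RAID 5 usable capacity: (number of disks - 1) x disk size
DISKS=3
DISK_TB=2
USABLE_TB=$(( (DISKS - 1) * DISK_TB ))
echo "usable capacity: ${USABLE_TB} TB"   # prints "usable capacity: 4 TB"

# A 4th disk would add 2 TB and one more spindle to stripe reads across
FOUR_DISK_TB=$(( (4 - 1) * DISK_TB ))
echo "with 4 disks: ${FOUR_DISK_TB} TB"   # prints "with 4 disks: 6 TB"
```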

If anyone out there has recommendations for getting the best performance from this RAID controller, please share them. I’m not sure whether the current 512KB stripe size or the cache settings offer the best performance for my needs – a general file and multimedia server for a small home network. If anyone would like me to run additional ATTO tests with different settings, let me know. Also, please feel free to share any other mods or ideas you may have for the MicroServer.

I plan to post additional details on the installation and setup of the card, with some more pictures.

Posted in Computers, Networking, Windows Server

My Comments on an HP MicroServer N40L

I’ve been playing with an HP MicroServer N40L for a few weeks now. Initially, I installed Ubuntu 11.10 and tried it out for a few days, but wiped it and installed Windows Server 2008 R2. I plan to use this machine as a file and media server, so I’m exploring a few options out there to give me the most flexibility and allow me to do what I want.

While reading a few blogs and forum posts, I’ve noticed most people – or at least a lot of them – are using WHS 2011 on their MicroServer. I don’t have a license for WHS 2011, but do have licenses for Server 2008 R2, so that’s what I’m using so far. In addition to using it as a file server, I would like to do some minor virtualization, mainly to separate the main OS from the media apps.

After upgrading the RAM from the initial 2GB to 4GB, I tested ESXi 5, which ran okay (a little slow using local storage), but not being able to use local disks as pass-through disks for the virtual machines was a big turn-off, so I discarded that idea.

I’ve been running Hyper-V – initially on a full Windows installation, but then I decided to try it on Server Core, and so far it’s been great. One of the best things about Hyper-V for my needs is the ability to use a local disk as a pass-through disk: I can install the OS on a VHD and use pass-through disks for storage. I’ve been testing this for a few days, and I don’t see much (if any) of a performance hit when copying files over the network. When copying large files from a networked PC to the virtual machine’s pass-through disk, speeds range from 20MB/s to 110MB/s+, depending on the kind of file. Large files such as ISOs or MKVs are the fastest to copy, mostly at 90MB/s+.

Not all is nice and pretty with Hyper-V on Server Core – it initially requires a bit more work to get properly configured and running – but once it’s up, it’s a “set it and forget it” kind of thing. I may post my installation and configuration notes for Hyper-V on Server Core 2008 R2, from beginning to end, in the near future.