Posted in Computers, Fedora, Linux

VMware Tools Cannot Find Kernel-Headers on Fedora 18 x64

I recently installed Fedora 18 x64 on VMware Workstation 9, and was initially unable to complete the VMware Tools installation using the same methods I’d used many, many times with prior Fedora installations. The installer kept telling me that it couldn’t find the kernel-headers folder. I had installed the development tools with my Fedora install, and they were all up to date, so I was a bit puzzled.

Prior to installing the VMware Tools, you need to install the Fedora development tools if you don’t have them – or if you’re unsure, just check – otherwise the installer will complain that it cannot find something, and will ask you to provide a path.

The development tools needed are: gcc, make, binutils, kernel-devel, kernel-headers

I also recommend updating the existing kernel to match the versions from kernel-devel and kernel-headers.

  1. Update your kernel, and restart the VM after the installation: # yum update kernel
  2. Install the development tools, and restart the VM when finished: # yum install gcc make binutils kernel-devel kernel-headers
  3. Run the VMware Tools installer script, accepting all the defaults (unless you know what you’re doing and want or need to change something)

If the script complains that it cannot find the location of the kernel-headers – and you verify that they are installed, e.g. with # rpm -qa | grep kernel – then you need to copy a header file to the location the installer expects. Find out which kernel you’re running with # uname -r. The current kernel on my system as of 01-19-2013 is 3.7.2-201.fc18.x86_64.

Run the following command to copy version.h into the directory where the installer looks for the header files.

# cp /usr/src/kernels/3.7.2-201.fc18.x86_64/include/generated/uapi/linux/version.h /lib/modules/3.7.2-201.fc18.x86_64/build/include/linux/

If the installer script is waiting for input, you can type that path where it asks for the header-files location. If it isn’t running, run it again, and it should find the path automatically.
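The copy step above can also be sketched as a small script that derives the paths from the kernel version instead of hardcoding it. This assumes the Fedora 18 header layout shown above applies to your kernel; the actual copy is left commented out and should be run as root.

```shell
#!/bin/sh
# Build the version.h source and destination paths from the kernel
# version instead of hardcoding it. KVER defaults to the running kernel.
KVER="${KVER:-$(uname -r)}"
SRC="/usr/src/kernels/${KVER}/include/generated/uapi/linux/version.h"
DST="/lib/modules/${KVER}/build/include/linux/"
echo "would copy: ${SRC} -> ${DST}"
# cp "$SRC" "$DST"   # uncomment and run as root to perform the copy
```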

Thanks to user jgkirk from the VMware forums for this tip. The original post that helped me can be found here.

Posted in Electronics

Unable to Scan for Networks HTC Inspire 4G

For the past few months I’ve been having some wi-fi and mobile network issues with my HTC Inspire 4G, running Android 2.3.3.

Some of the problems were forgotten wi-fi networks, dropped connections that wouldn’t reconnect unless I turned wi-fi off and back on, and poor or no wi-fi signal, among others.

The problem got worse with time, up to the point where the wi-fi wouldn’t even turn on. As soon as I attempted to turn it off and on again, I got “error”. Restarting the phone seemed to help sometimes, but now it doesn’t work at all. I’ve also reset it to factory settings a few times. This too seemed to help at first, because the wi-fi started working again for a day or two, only to stop working altogether shortly after. In addition to restoring the phone to factory settings using the phone itself, I also downloaded the software from the HTC site and did it that way too… it worked for a day, but now not at all. Now I get an “unable to scan for networks” message, and sometimes I see “error” too when attempting to turn it on. The Wi-Fi MAC address shows “Unavailable” under Settings > About phone > Hardware information.

It doesn’t matter where I am; the wi-fi will not turn on at all. I have no idea what to do anymore, and can’t use the phone on AT&T’s network all the time since I don’t have an unlimited data plan, so the phone is kind of useless at this point. I rely on the phone’s wi-fi connection a lot, so it not working is a big deal.

After doing some Google searching, I see other people having the same problem, with different phones. Some were able to get the wi-fi working again after a reset, others are still disconnected like me. The fact that people are having this very same issue on different phones leads me to think that it may be software related, especially if the problem went away after a phone reset.

I’m not sure what could be wrong with my phone, but I’m thinking it could be a hardware issue… here’s my logic: if it were software, shouldn’t it get corrected after a phone reset? And supposing the software on the phone itself was corrupted and the phone was restoring a corrupted version of it… shouldn’t the problem fix itself after doing a factory reset using the HTC software? I could be wrong, since I haven’t found a definitive answer to this issue.

I’ve been using an old iPhone 3GS in the meantime while I figure out what to do, and I just cannot get used to an iPhone again… SO limiting!

I’ve got to find a good AT&T smartphone to replace this one if I cannot find a fix.

Posted in Computers, Windows 8

Windows 8 Consumer Preview on SSD

I just installed Windows 8 Consumer Preview Build 8250 on my system after playing with it on a virtual machine for a while. While configuring my system, looking at different settings and such, I noticed that the defrag screen recognizes my SSD as a solid state drive, and tells me that it needs optimization.

I thought defragmenting a solid-state drive was a bad idea. Windows 7 disables scheduled defragmentation by default if it recognizes the drive as an SSD.

I’m not sure how different the defrag program in Windows 8 is from previous versions, so I’m a little concerned about letting it “optimize” my SSD.

The SSD is the Agility 3 60GB. See image below.

Has anyone else come across this, and if so, what have you done?

Posted in Computers, Hardware

HP P212 Smart Array Controller Performance Test on HP MicroServer N40L

Here are the results of a speed test after installing the HP P212 Smart Array controller on the HP MicroServer N40L. The OS is installed on the 250 GB drive, which I moved to the ODD slot position and is connected to the motherboard via the internal SATA port. The BIOS was upgraded with a modded version to unlock the SATA ports from IDE configuration.

Network speeds top out around 125MB/s when copying or moving large files (4GB+) such as ISOs or MKVs. Smaller files such as pictures, MP3s, and documents copy more slowly from a networked Windows 7 x64 PC connected through an 8-port TrendNet gigabit switch: speeds vary with the number and size of the files, dipping to 30–40MB/s but mostly staying in the 60–70MB/s range, sometimes higher.

The server is running Windows Server 2008 R2 Enterprise.

Additional specs:

  • 8GB ECC Kingston RAM
  • HP P212 Smart Array Controller with 256MB cache and BBWC (battery-backed write cache)
  • 3 Samsung HD204UI disks in RAID 5 connected to the P212 controller, 512KB stripe size, cache split 25% read / 75% write, write-back enabled
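As a quick sanity check on that array’s capacity (the HD204UI is a 2TB drive), RAID 5 gives up one disk’s worth of space to parity:

```shell
# RAID 5 usable capacity: (n - 1) disks of data, 1 disk's worth of parity.
# Three 2TB HD204UI drives should yield about 4TB usable.
DISKS=3
DISK_TB=2
USABLE=$(( (DISKS - 1) * DISK_TB ))
echo "Usable capacity: ${USABLE} TB"
```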

The system is configured and running. It only has the file and printer sharing role installed, plus indexing for certain folders (paused for the tests) and CrashPlan Desktop+ (in sleep mode for the tests). The server was idle, showing no CPU or disk load during these tests.

The default settings for ATTO were used, only changing the drive to be tested.

Clicking on each image will open it up to its full size.

Test 1

Test 2

Test 3

Not sure why test 3 was the slowest, even after trying a few times. There was no load on the server at all.

From what I’ve read online, adding an extra disk to the RAID 5 array should improve speeds, but so far this is pretty satisfactory for me, especially when using “green” disks that run at 5900 RPM.

If anyone out there has any recommendations for getting the best performance from this raid controller, please share them with me. I’m not sure if the current stripe settings of 512KB or the cache settings offer the best performance for my needs, which are general file & multimedia server for a small home network. If anyone would like me to run any additional tests in ATTO by using different settings, let me know. Also please feel free to share any other mods or ideas you may have for the MicroServer.

I plan to post additional details on the installation and setup of the card, with some more pictures.

Posted in Computers, Networking, Windows Server

My Comments on an HP MicroServer N40L

I’ve been playing with an HP MicroServer N40L for a few weeks now. Initially, I installed Ubuntu 11.10 and tried it out for a few days, but wiped it and installed Windows Server 2008 R2. I plan to use this machine as a file and media server, so I’m exploring a few options out there to give me the most flexibility and allow me to do what I want.

While reading a few blogs and forum posts, I’ve noticed most people – or at least a lot of them – are using WHS 2011 on their MicroServer. I don’t have a license for WHS 2011, but do have licenses for Server 2008 R2, so that’s what I’m using so far. In addition to using it as a file server, I would like to do some minor virtualization, mainly to separate the main OS from the media apps.

After upgrading the RAM from its initial 2GB to 4GB, I tested ESXi 5, which ran okay (a little slow using local storage), but not being able to use local disks as pass-through disks for the virtual machines was a big turn off, so I discarded that idea.

I’ve been running Hyper-V Server – initially on a full Windows installation, but then I decided to try it on Core, and so far it’s been great. One of the best things about Hyper-V for my needs is the ability to use a local disk as a pass-through disk. This way, I can install the OS on a VHD and use pass-through disks for storage. I’ve been testing this for a few days, and I don’t see much of a performance hit (if any) when copying files from the network. When copying large files from a networked PC to the virtual machine’s pass-through disk, speeds range from 20 to 110MB/s+, depending on the kind of file; large files such as ISOs or MKVs are the fastest to copy, mostly at 90MB/s+.
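One detail worth noting for anyone trying this: a physical disk has to be offline on the host before Hyper-V will let you attach it as a pass-through disk. A minimal diskpart script for that might look like the following – “disk 1” is an assumption, so check with “list disk” first:

```
rem Take the physical disk offline so Hyper-V can attach it as a
rem pass-through disk. Save as offline-disk.txt and run:
rem   diskpart /s offline-disk.txt
select disk 1
offline disk
```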

Not all is nice and pretty with Hyper-V on Server Core, as it initially requires a bit more work to get properly configured and running, but once it’s up, it’s a “set it and forget it” kind of thing. I may post my installation and configuration notes for Hyper-V on Server Core 2008 R2 from beginning to end in the near future.

Posted in Solaris, Unix

Shut Down Solaris 11 Express

Here’s a quick command to shut down – or power off – a Solaris machine. I tried this on Solaris 11 Express, and have also verified that it works on OpenSolaris and OpenIndiana.

From the terminal:

$ sudo shutdown -y -i5 -g0

This is what it means:

– sudo: Run the command with elevated privileges. Not needed if logged in as root.

– -y: Confirm that you DO want to shut down the system.

– -i5: Init level 5: power off the machine.

– -g0: Grace period of zero seconds (that’s a zero, not the letter “o”), so the machine shuts down immediately. Increase the number to delay the shutdown by that many seconds. I always use 0 seconds on my Solaris server.
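For reference, a couple of variations on the same command – note that these will really power off or reboot the box, so don’t run them casually:

```
# Power off after a 60-second grace period instead of immediately:
$ sudo shutdown -y -i5 -g60

# Reboot instead of powering off (init level 6 is reboot on Solaris):
$ sudo shutdown -y -i6 -g0
```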