I am delighted to finally be able to announce that I am in the process of co-authoring a book. At first the prospect of being involved in such a project seemed a bit daunting, but I very quickly realised that opportunities like this are few and far between, and it is something I have to take on and make time for.
I am not in a position to disclose any details about the project just yet, but I can say that it will be on virtualisation and, surprisingly enough, on a subject that is not covered by any books I know of at the moment.
If any of the many experienced authors out there are interested in sharing some tips with me, please drop me an email or a DM on Twitter! I would appreciate any advice or suggestions, as this is my first time authoring a book.
I’ve been getting quite a lot of emails asking what equipment I’m running in my home lab. Rather than having to reply to each and every email with the full inventory, I thought it would be a better idea to just post it on here.
Now, my home lab is nothing special. Compared to the toys that some people have the pleasure of playing with in their labs, and I'm not mentioning any names (hmmm, Mike Laverick), my little environment is cheap and simple, but it does the job well enough for me, so I can't really justify any upgrades. Not at the moment anyhow.
Let’s start with the stuff you’re probably most interested in, the servers:
For VMware vSphere, I have two whitebox servers, each with the following configuration:
- AMD FX-8320E 8-Core FX Series CPU
- Asus M5A78L-M/USB3 Socket AM3+ Motherboard (Micro-ATX)
- 2x CORSAIR CML16GX3M2A1600C9 (32GB Dual Channel DDR3)
- 350W BeQuiet PURE L8 BN221 Power Supply
- Gigabyte GZ-MA02 Case (Micro-ATX)
- 1x Intel PRO/1000 GT Dual Port Gigabit Ethernet Adapter
- No local disk drives; ESXi 5.5 boots from a USB stick
In addition, I also use two machines of the following specification in a two-node "compute" cluster when testing vCD or vCAC deployments:
- HP Proliant ML110 G5
- Intel Xeon 3065 @ 2.3GHz (FT is unfortunately not supported)
- 8GB RAM
- 1x Onboard Broadcom NC105i Gigabit Ethernet Adapter
- 1x Intel PRO/1000 GT Dual Port Gigabit Ethernet Adapter
- No Internal Hard Disk Drive
- Boot ESXi 5.5 from a USB Stick installed internally on the motherboard.
For shared storage, I have a Synology DS1512+, populated with 4x 1TB disks in RAID 10. This provides more than enough storage capacity for my requirements, and the DS1512+ also supports VAAI.
The hosts are all connected via a 3COM (HP) Baseline 2924, 24-port gigabit managed switch.
Now, I should stress that this is a non-production, test environment only. There's not much resilience in there, and I simply do not have the space for any more server and network equipment. Besides, it costs a small fortune to power these things 24x7 anyway.
So that covers the servers. My “desktop” PC, on the other hand, is another matter entirely. Seeing as I only buy a new PC every 5 or 6 years, and seeing that I’ll be using this machine predominantly in my spare time to run graphics-intensive applications such as flight simulators, I really went all out on this one in terms of performance, power and stability:
- Cooler Master CM Stacker 830 Full Tower Case
- Antec TPQ-1200OC TruePower Quattro OC 1200W Modular Power Supply (PSU)
- ASUS Rampage III Extreme X58 Motherboard
- Intel i7 950 CPU (surprisingly the slowest component in the box)
- 12GB RAM: 6x Corsair Memory Dominator GT 2GB DDR3 1866 Triple Channel Modules
- 1x Samsung 840 PRO 128GB SATA III SSD (Operating System Install)
- 1x 64GB OCZ Vertex 2 SATA II 2.5" SSD (Flight Simulator installs)
- 1TB Seagate SATA II drive for applications and data storage
- 1x Onboard Gigabit Ethernet Port
- M-Audio Revolution 5.1 Sound Card (PCI)
- XFX ATI Radeon HD 5970 Black Edition 2GB Graphics Card
Just a quick note on the case: I cannot stress enough just how large this case is. IT IS HUGE! I caught myself staring in amazement at the thing for about 20 minutes when it arrived. If you ever think about getting one of these cases, or a similar one for that matter, make sure you have enough space to park this thing NEXT to your desk, because it’s not going anywhere else!
Just a quick note on the graphics card: the graphics card is the main reason for my decision to purchase the CM Stacker case. I needed a case that would provide sufficient space for the card to actually fit into. The graphics card is a massive 13 inches long, and believe me when I say that it only just fits the case, with maybe an inch or so to spare at most.
For stability reasons, this card also demands a certified power supply capable of a bare minimum of 600W. As I plan to install a second, and maybe even a third, card in a CrossFire configuration later on, I thought it best to buy a power supply now that will provide sufficient stable power for future upgrades.
So I’ve had my HTC Desire HD mobile phone since it was released in October 2010. As I’m not one to follow trends, line an arrogant company’s pockets and make a fashion statement by opting to buy an iPhone 4, the HTC Desire HD really was the obvious choice for my next phone. Overall, I have to say it’s a brilliant piece of kit. Android is really easy to use and it does everything, well, almost everything you ask of it very well. And when HTC put their spin on Android, they produced a device that really gives the iPhone 4 a solid competitor.
However, as I said before, I’m not one to follow trends, but I’m also not one who likes arrogant companies like Apple, and now Google. How can I say Apple is arrogant? Well, easy! They charge an arm and a leg for a device that will only be good for 6 months to a year (if you’re one that follows fashion trends) before they replace it with “Version 2”. To make it worse, they don’t even give you the option to change simple things like the battery. However, this post is not here to slam Apple, as I use a Mac myself with Final Cut Studio, a software package that is very powerful, easy to use, and compared to similar products from other vendors, rather cheap (and we all know the word cheap and Apple don’t go together all too often). No, this post is here to slam Google for not listening to its customers when all we want to do is keep our text messages once we have received them.
Yes, that’s right! All I want is for my text messages to live happily ever after on my Android device after I have received them! Is that too much to ask of a mobile device? For a 10-year-old Nokia that’s been dropped, kicked and lived life to its fullest? No problem. For a brand new Android device? Well, that could be a problem.
Many people, including non-technical people, will ask why someone would complain about a new, state-of-the-art device’s handling of the rather ancient technology known as text messages, also known as SMS. But I have to say, it is rather annoying when your shiny new, all-bells-and-whistles mobile phone (or for you guys in America, cell phone) randomly decides to delete all of your text messages. What’s more annoying than that is when it happens a second time two months later. And what’s even more annoying than losing more than a thousand text messages over a period of 3 months is the fact that the great search engine giant responsible for the Android operating system, Google, doesn’t seem to be bothered about the issue. At first, I thought it was an issue that only affects the HTC Desire HD handset, but a quick search on Google revealed that it affects most, if not all, Android devices.
My mobile phone has now on two occasions randomly decided to delete all of my text messages! How? Well, easy! I decide to send a text message, and when I open Text Messages on my phone, all my messages are gone!
When this happened to me again last night, I decided I’d jump on Google.com and see if anyone else was having the problem. Sure enough, it turns out the issue has been around for a very long time. Many people have complained about losing thousands of text messages, and in some cases photos and contact information, and this has been going on since 2009.
The issue, known on the Google Code forums as issue 5669, has been around for more than a year. Since 26 December 2009, a whopping 959 posts of complaints have been made regarding this problem on this forum thread: http://code.google.com/p/android/issues/detail?id=5669
Yes, that’s right. People have been losing data on their handsets for more than a year and Google still hasn’t replied, or to my knowledge even acknowledged that there is an issue.
When Apple launched the iPhone 4, the entire world kicked up a fuss over a small issue where the phone had signal problems in certain cases. That, in my book, isn’t an issue compared to data loss. Yet the peeps at Apple were forced to respond with advice on how to fix it, because it was a media frenzy. Now my question is... WHERE IS THE MEDIA FRENZY on this one? Why is Google allowed to get away with not listening to their customers?
You know what? Everyone out there, if you are unsure about which mobile phone to buy next, BUY AN iPhone. Don’t buy an Android device. Google doesn’t deserve your support, as clearly we, the consumers and users of their operating system, aren’t important enough for them to care. It’s a sad day, as I love my HTC Desire HD, but in reality, until they fix this data loss issue, I cannot trust it with ANY of my data at this point in time.
It’s been a while since I last sat down to write a blog post. I have to admit, 2010 was not exactly my best year in the blogosphere. I do have my reasons why, but that conversation is outside the scope of this post.
As many would know by now, in the past month I’ve been able to achieve both the VMware VCAP-DCA and the VCAP-DCD certifications. I only have one more hurdle to clear in order to obtain VCDX certification, and that will undoubtedly be the hardest part. That is, I need to successfully defend a vSphere design that I’ve done before a VCDX panel of gurus.
As for the current VMware certification path (until the next major release), it would be safe to say that, with the exception of VCDX, I have come to the end of the technical certification road. It is one of the reasons I have decided to destroy my very trustworthy vSphere lab, which has carried me through a few vSphere beta programmes and played a major role in achieving my VCP and VCAP certifications. Yes, indeed, I will be replacing the lab (for now at least) with Hyper-V.
There is more than one reason for my decision to replace my vSphere lab with Hyper-V, and I will try my best to explain in detail how I came to it.
VMware has for some years now been the de facto standard in x86/x64 virtualisation. That is why so many IT professionals have fallen in love with their products. To this very day, vSphere’s feature set and, in most cases, performance are unbeaten by any other product. Yes, I probably have some Citrix and Microsoft evangelists’ blood boiling by saying that, but hey, the truth hurts. Sure, Hyper-V has come a long way and has improved immensely over the last two years; however, it still cannot match the feature set of vSphere. How can I say this? Well, it’s not me, it’s Microsoft themselves.
The following question was put to a Microsoft salesperson: “How do you approach a customer who has the intention of buying vSphere? How do you fight your corner? Do you compare features?” The salesperson’s answer: “Never that. We never compare features!”
I get the feeling Microsoft is not trying to steal VMware’s customers, at least not for now. Instead they seem to have opted to go for customers they know will be an easy sell. Maybe someone who already has a Datacenter licence, where the initial outlay of licensing costs is minimal compared to vSphere. I mean, after all, VMware is quite expensive.
“So, why are you swapping your lab to Hyper-V if they can’t match the features?” you might be asking. Well, it’s simple. I don’t know enough about the product and its capabilities, and the best way to find out is to play with it. No, seriously, I’m aiming for Hyper-V certification before I depart for California at the end of March.
Many people might ask: why bother? Why bother learning Hyper-V if you already know so much about the better product? The answer is maybe not as simple, but let me try to explain.
Hyper-V will sell. That is a fact. There is no point in fighting it; it will sell. The Microsoft badge will sell the product. It always has sold many products, and it will continue to do so. We have seen this before. Let’s use directory services as an example. Does anyone remember NDS (Novell Directory Services), or more recently eDirectory? In my opinion (and sorry if I offend Microsoft evangelists here, but I’m entitled to my opinion, so I don’t really care), NDS/eDirectory was and still is a much better directory service than Microsoft Active Directory (AD). You see, the thing is, even though many of us have installed eDirectory in our environments and client environments, the majority of infrastructures today are based on Active Directory. Many will disagree, but I still firmly believe the old NDS/eDirectory would crush AD in terms of functionality, ease of use, performance and stability, but sadly today the product is nowhere to be found. Even though NDS was/is better, AD has killed it off, and today Novell is all but dead! Microsoft just knows how to run a company! It doesn’t matter how good the other products are, the Microsoft badge will still help them sell theirs. We can look at other products as well. What about WordPerfect? Better than Microsoft Word? Hell yeah. Guess what, WordPerfect is nowhere to be found, and it used to be everywhere.
I know, I’ve hit a sensitive point here, and I’m sure VMware and (past) Novell evangelists (I am a VMware evangelist and certainly used to be a Novell one) are cursing at their monitors while reading this, but just think for a minute. At the end of the day, VMware, Microsoft, Citrix, these guys exist for one reason and one reason only: profit. I have realised over the past week or so that it doesn’t matter to my finances which one is better. I can be a top technical person in VMware products today, and maybe next year and a few years thereafter, but it would be foolish not to want to learn about Hyper-V. Ok, fair enough, I can’t see what the future holds, but I can learn from the past, and the past shows me that all my friends who kicked against AD in favour of eDirectory are today lagging behind and struggling to find contracts. These guys were Master CNEs. Today they don’t have much to boast about.
Who knows, maybe VMware will still somehow be going as strong as they are today in 20 years or so (heck, with their pricing strategy, I’m not sure how they’ll fight off M$), but for today, I’ll be covering my bases and laying some foundations with some knowledge of Hyper-V. At least Microsoft gives us TechNet Plus, something VMware doesn’t seem to understand either. Talk about stingy! WE NEED NFR LICENSES VMWARE, but that’s a discussion for another day.
Don’t get me wrong on this. I love vSphere. I love vSphere more than any other IT product / technology out there today and I will still try and sell vSphere over Hyper-V, just because it’s a much better product. It’s going to break my heart to destroy my beloved vSphere lab, but it’s got to be done.
I believe it’s time to stop kicking and start working with VMware and Hyper-V.
The views expressed in this post and on this entire website are that of my own and not those of my employer or any vendor. I am solely responsible for the contents on this website.
Yesterday, December 9, 2010 at 10:45am, I took the VMware VCAP-DCA exam thinking I would probably be able to finish it off in about an hour and a half or so. Wrong.
I booked this exam quite a long time ago. If I remember correctly, I got a call from VMware around October 8, 2010 asking me for a preferred date. I think I said “Late November please”. Anyway, the exam was originally booked for 29 November, but thanks to the London Underground workers taking an unpaid day off work (strike), I thought it would be wise to reschedule.
I thought I would at least have time to study and play in my lab before the exam, as I had two months! Nope, I had no time. Even in two months... no time to study! Ok, well, I had the odd day here and there, but I don’t think I actually learnt anything during those days. In fact, I think I managed to study more for the VCE310, and believe me, although I passed that exam, it was still an “unprepared” attempt. I did get through the first few pages of the VCAP-DCA exam blueprint though. In the end I guess I thought my experience with VMware products should [hopefully] be enough!
So, on the day of the exam, I arrived at Holborn Underground station at 10:10am. The Pearson Professional Test Centre is a short walk from Holborn station. However, while I was waiting at the traffic lights to cross Kingsway, the police stopped all traffic in all directions. After about 30 seconds the Queen pulled up right in front of me! Way to go!!! I’ve never seen the Queen in person before! She was about 3 metres away from where I was standing. Cool!
Anyway, back to the VCAP-DCA. Arriving at Pearson Vue, the standard procedure was followed. You present two forms of ID (I presented my passport and UK driving licence), sign an electronic pad, and get your picture taken! Then you place all personal belongings (wallet, phone, keys, wristwatch, etc.) in a locker provided. You should only take the locker key and your passport into the exam room.
I was then told to go to room 3, a room where I’ve spent many hours before! The lady at room 3 took a full 10 minutes to sign me onto a workstation as her PC sort of crashed. I could tell she was getting really worked up after a while. But I was relaxed and really couldn’t be bothered to get grumpy about the slow sign-on as... I’d just seen the Queen!!!
I finally got signed into workstation 32. The good old VMware Certification Survey thingy appeared. I hurried through that as I couldn’t wait to start [and finish] my exam. And there it was! The biggest surprise of the day [apart from seeing the Queen]! No multiple choice questions. That’s right, not a single one! The entire exam is one big lab session, so all you Test King lovers (in other words, cheaters), please browse away from Test King if you intend to pass this exam.
Once you click start, the lab session launches an RDP session to somewhere (I’d guess probably VMware HQ). All you get in the RDP session is a PuTTY client, the vSphere Client, Adobe Reader (yes, that’s right, though I’m not sure I’m allowed to say what it’s for), and I’m sure there was something else, but I cannot remember what it was! I guess I didn’t need it after all.
At the top of the screen you’ll find a bar that allows you to switch between the actual lab and the lab questions. A full list of usernames and passwords for the different systems in the lab is provided at the bottom of each question, so there’s no need to take them down before you start.
I have to say that the VCAP-DCA lab environment has been improved over the lab environment provided with the VCE310. I didn’t have any graphical glitches, although the session only ran at 1024x768, which was problematic at times. Also, don’t bother using the Adobe Reader provided, as scrolling through pages is extremely slow and will only cost you valuable time.
The exam was about 3 and a half hours long, with 35 questions/scenarios. There is no link between the question you are on and the component you are using in the lab. It is a live lab, and any changes made in question 1 will remain in place throughout the rest of the exam.
Also, some lab questions will have you configuring specific components in the lab which, you’ll find, need to be in place as prerequisites for successfully completing other questions later in the exam. For instance, one question might ask you to configure a DRS cluster, and maybe 20 questions later you will be asked to configure some resource pools. Resource pools cannot exist without DRS. This might be a simple example, but believe me, the real exam will require complex configurations that depend on each other being correctly in place.
Some questions were complex and tricky to complete. Others were way too easy in my opinion. And there are questions you will never be able to answer without extensive experience in enterprise-sized environments and FC storage, unless you are some kind of freak who knows every line of every PDF ever written on enterprise storage and virtualisation.
At the end of the exam you are told that you will receive your results within 10 business days [I’d like to see that happen before I believe it though].
Overall, I’d say it’s probably the most fun I’ve had writing an exam [if the words fun and exam are allowed in the same sentence]. I think it was well thought out and well presented. It will certainly separate the men from the boys in terms of experience, and definitely separate the real IT professionals from the book crammers, which is a very good thing.
On 06 November 2010, VirtualVCP.COM was migrated to a shiny new Web Server. The site was picked up and moved with all data [hopefully] intact. However, if you come across any dead links or a broken RSS feed, I'd be grateful if you can let me know via the contact form.
I have received an email from StratoGen informing me that they are looking for experienced users to join their vCloud Beta Program, which is based on VMware vCloud Director. Their message is below:
StratoGen are seeking experienced VMware users to join the StratoGen vCloud Beta Program which is based on VMware vCloud Director.
Beta testing is a crucial element in the cycle of our product releases, and we work closely with the VMware community to ensure our products are the best they can be. vCloud Director is a powerful but complex product and as such we are seeking experienced users to provide informed feedback on our product offering.
By participating in the program you will be provided with resources on our enterprise platform enabling you to build, deploy and manage virtual machines, vApps and networks using the StratoGen vCloud Director portal. You will be contacted on a periodic basis for feedback.
StratoGen is a leading VMware Service Provider Partner (VSPP) with an extensive cloud hosting platform based in London, UK.
If you would like to take part in the program please register at http://www.stratogen.net/products/vmware-hosting-vcloud.html
In VMware ESX you can use the vimsh command from the command line to retrieve CDP information. With the release of ESXi 4.1, vimsh is not included. However, there is still a way to retrieve CDP information via the CLI: instead of vimsh, you simply use vim-cmd. The path to the utility is /bin/vim-cmd.
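As a rough sketch of what that looks like in practice (run from the ESXi 4.1 Tech Support Mode console or an SSH session; from memory the CDP data lives in the "network hints" under the hostsvc/net namespace, so treat the exact sub-command name and the grep filter as assumptions and check them against your own build first):

```shell
# List the sub-commands available under the network service namespace,
# to confirm what this particular build of vim-cmd offers
/bin/vim-cmd hostsvc/net

# Query the network hints for the physical NICs; the output includes
# the CDP details (neighbouring switch device ID, port ID, etc.)
/bin/vim-cmd hostsvc/net/query_networkhint

# Optionally filter the (fairly verbose) output down to the switch
# and port fields -- field names here are from memory
/bin/vim-cmd hostsvc/net/query_networkhint | grep -i -E "devId|portId"
```

These commands only make sense on an ESXi host itself, so the output will obviously vary with your physical switch configuration.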
I have to write this down as I'm sure I'll come across this issue again in the future. Today I had to install ESXi 4.1 on a ProLiant BL465c G7 blade server. This turned out to be problematic, to say the least. It turns out that the native ISO image downloaded from the VMware website does not include all of the drivers required for ProLiant G7 blades. Anyway, as I had a very busy day, I really didn't have time to figure this one out for myself. Luckily for me, Steve [Bryen] has already done so and posted the workaround on his blog:
So I've not been blogging (or tweeting) for a very long time. To be honest, I've just been so busy over the past couple of months that I just didn't have any time. I know it's bad, but hey, I'm learning every day, and soon I'll be typing away on some technical blog articles again.
Anyway, Mike Laverick over at RTFM Education has launched his VMworld 2010 SwagBag Xmas Raffle. There are loads of goodies included in this raffle, so I would advise you point your browser at http://www.rtfm-ed.co.uk/2010/10/20/the-rtfmvmworld-2010-swagbag-xmas-raffle to get involved!
Hopefully I'll be back later with some techie stuff again.