Friday, May 9, 2008

Moving OU structures out of the test network

Designing an Active Directory in a test environment is a good thing. However, doing so requires you to think through some extra steps. When the design is complete, then what? How do you move the setup from the test system into the live environment? One option would be to turn your test environment into your live environment. Another would be to do a backup and restore of Active Directory. The safest way is to manually recreate each item in the soon-to-be-live system. To me, these methods seem clunky and sometimes dangerous. I like clean and simple (why do I have to do all of the work instead of the computer?).

I ran across a situation like this recently. Over the course of several meetings we developed the “perfect” OU structure for our new AD. The old structure was not configured properly (no, OUs are not just folders and they aren’t the same as security groups). Once we got the new structure set up on a VM, I had a dilemma: how do I get it into the live domain without rebuilding it all by hand? I decided there must be an easier way. Enter PowerShell.
I used the PowerShell AD cmdlets from Quest (have I said recently how awesome these tools are?) to export all of my OUs from the test network into a CSV file. Since I like clean, I only took what I needed, which was the DN and the name. Here is the code:

get-qadobject -type 'organizationalUnit' | select 'dn','name' | export-csv 'c:\NewOUs.csv'


From there, I had a CSV that listed every OU my new domain would need. You would think I could just import the file into PowerShell and loop through it, but noooo. That would be too easy. You see, we need to know the distinguished name of the parent container when we create each OU. Some of you may be saying that I forgot to include that in my original query. There is a property called ParentContainer that you can get for an OU. It sounds good until you try to use it. That is when everything breaks. It turns out the ParentContainer property stores the data in path format (domain name\ou\ou\etc.), not distinguished name format. Not to worry, I had a plan. See if you can figure it out:

import-csv 'c:\NewOUs.csv' | foreach-object {new-qadobject -parentcontainer $_.dn.substring($_.dn.indexof(",")+1) -type 'organizationalUnit' -NamingProperty 'ou' -name $_.name}


What I did was take the DN of the OU (which includes the OU itself at the front) and strip off that first OU component. I found where the first comma was, then took a substring of the DN starting from the first character after the comma through the end of the string. What is left is the distinguished name of the parent container.
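
For example, with a made-up OU (the names here are purely for illustration):

$dn = 'OU=Helpdesk,OU=IT,DC=example,DC=edu'    # example DN, made up for illustration
$dn.indexof(",")                               # returns 11, the position of the first comma
$dn.substring($dn.indexof(",")+1)              # returns 'OU=IT,DC=example,DC=edu', the parent container's DN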

Every OU now populates into the fresh new domain controller. Run this in a test environment first (please don't run it against your live AD until you have; if you mess something up, don't mention my name). You may see some errors as the script runs; if you are importing the structure into a fresh AD, you should see only one. These errors can be ignored. What is happening is that the system detects a duplicate OU and errors out instead of overwriting it. That is a good thing. In a fresh system, the one duplicate it finds is the “Domain Controllers” OU that ships by default.
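
If you would rather not see the duplicate errors at all, you could quiet them down with the standard -ErrorAction common parameter. This is just a tweak of the import line above, assuming you are happy to silently skip any OU that already exists:

# same import as before, just silencing the duplicate-OU errors
import-csv 'c:\NewOUs.csv' | foreach-object {new-qadobject -parentcontainer $_.dn.substring($_.dn.indexof(",")+1) -type 'organizationalUnit' -NamingProperty 'ou' -name $_.name -ErrorAction SilentlyContinue}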

Now that these commands are set up properly, you will notice the added benefits they provide. What if you wanted to play around with changing your OU structure? No problem. Just export your current structure and import it into a test environment. Combine this with a user and group export through PowerShell and you could move over your entire AD easily, without bringing over the baggage that sometimes comes with a backup and restore.
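
As a rough sketch of that idea (which properties you keep is up to you; the ones below are just examples), the user and group exports follow the same pattern as the OU export:

# adjust the property lists to whatever your environment needs
get-qaduser | select 'dn','samaccountname','firstname','lastname' | export-csv 'c:\NewUsers.csv'
get-qadgroup | select 'dn','name','grouptype' | export-csv 'c:\NewGroups.csv'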

Monday, April 28, 2008

Automating Active Directory User Creation

I have been learning to use PowerShell over the past few months. Every script I write, I write in PowerShell so I can improve my skills. It has become apparent to me that the name is very appropriate. With very simple commands that even a (gasp) non-programmer can use, you can automate just about anything. Today, I automated a way to create users in Active Directory. The script (if I can call it that, since it could all be done from one line) pulls in a CSV file that has been prepopulated with users, maps the fields to AD attributes, and creates the users. It can put them in different OUs, assign an expiration date, and much more.

I did not tackle every user property, just what I thought was important for our organization. I'll probably even add a couple of fields for us. The point, however, is that this is a more robust and fully functional example of how to add multiple users to the domain easily. From here you could set up a script to pull information from your HR system and either dump it into the CSV file for processing or have it run the import directly.
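
To give you a feel for the approach, here is a stripped-down sketch of the core of it. The CSV column names here are made up for the example, and the real script maps many more attributes:

# hypothetical CSV columns: ou, name, username, firstname, lastname, password
import-csv 'c:\NewUsers.csv' | foreach-object {new-qaduser -parentcontainer $_.ou -name $_.name -samaccountname $_.username -firstname $_.firstname -lastname $_.lastname -userpassword $_.password}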

Here is a link to the script.

Friday, April 25, 2008

Google Apps Rock

I've been debating what I should do with my Exchange 2003 server. Because I run IT for a college, I actually end up with a fairly decent-sized email server (over 1500 mailboxes) even though I have a small staff. When I look at everything I need to pay for each year (spam protection, antivirus, etc.) per mailbox, I cringe. The bill is higher than I can really afford. I also cannot offer the advanced services our students really want (lifetime email, larger inboxes, etc.). As a result, students don't use their mailboxes for much, and yet I am stuck paying for services for them.

Enter Google Apps. Google allows me to host as many mailboxes as I would like on their servers, under my own domain name, using the GMail platform. Besides being able to give the students 6GB+ of email space and the ability to keep the address for life, they can also take advantage of most of the GMail features like document collaboration, IM, sites, calendars, and very good spam protection. The best part is that this is all FREE! I am actually going to move our staff and faculty over to the platform as well and dump my Exchange server altogether. I can upgrade our staff to 25 GB of storage and email archiving for compliance purposes. Because we fall under the education umbrella, we are also allowed to hook up single sign-on capabilities directly to our Active Directory. All in all, Google has hit a home run here with a quality product at a price I can afford.

Tuesday, April 22, 2008

Documentation That Works - Part 1

I've read all of the advice articles about documentation. I know all of the reasons to document. I've benefited from good documentation and I've suffered through having poor (or no) documentation. Yet, even after all of this, I find it very difficult to create good documentation. If I create documentation in a Word doc or a help file, I have to remember to open the file up and make the necessary changes every time I do something. Usually I do well at this until an emergency comes up. Then I make the changes to the failing system, but in the heat of the moment I forget to update the documentation. The next thing I know, the documentation is all out of date and I don't know what information is good and what is bad. This gets compounded when I am not the only one forgetting to document.

I have decided that a complete system overhaul is needed. The current system has too many flaws (most of them human in nature). The best solution I have found to replace our current one is a wiki. A wiki is designed to be flexible, has change management built in, and is accessible to any of my employees from any computer. This brings the documentation closer to where they are. We chose to start using DekiWiki by MindTouch. We actually use their free hosting, at least for now, so that we can test out the functionality before we commit to it.

This solves part of the problem. The documentation platform is now closer and a little more accessible to the person making the changes. It also gives us the ability to share out parts of this documentation to a larger audience if we so desire. Finally, the change management allows me to review what has been changed and make sure the documentation reflects the change that was made. However, this hasn't solved the whole problem. It hasn't made it any easier to remember to document our changes. It also hasn't made the documentation itself any easier to write.

This is where my next installments will pick up. Part two will address how a script could be used to monitor changes to key systems and prompt for documentation of those changes. Part three will address the program I am developing to automatically create documentation (of my computers using the XML output of my computer inventory script, of my Active Directory using some PowerShell tools, and of other systems). The program won't be complete until it can upload the information right into the wiki, creating more documentation instantly than we could write in a month.

Sunday, April 20, 2008

Network Computer Inventory Script - Rough Draft

OK, so who hasn't seen a script or program that takes an inventory of your local computer (Belarc Advisor is a good example) or an entire network of computers (Newt works well)? Well, I like both of these options, but I wanted more control over the inventory (and at less cost). I did a little searching for a solution that utilized PowerShell. I came across a solution from Peter Stone. I took his script, ripped it apart, and put it back together a little differently. His solution was designed to output the information to an HTML file and to the screen for the local computer only. It utilized WMI, which was perfect for my purposes.

My goal was to create a script that scans a list of computers from one central computer, gets a full inventory of each machine's features, and stores the results in an easy-to-access format for later use. I ended up using a free snap-in from Quest to access my Active Directory and get the computer list (I didn't feel like writing a recursive ADSI query when the tool from Quest was so easy to use).

The way this script works is that the AD is scanned for a complete list of computers. Each computer is pinged to see if it is online. If it is, a complete WMI scan is done to gather all of the values desired. The entire scan is written to an XML file with the computer's name as the file name.
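
In case it helps to picture it, here is a boiled-down sketch of that flow. It is not the actual script; it only pulls a few representative WMI classes and assumes the Quest snap-in is loaded and a c:\Inventory folder already exists:

get-qadcomputer | foreach-object {
    $name = $_.name
    # ping through WMI so no extra tools are needed
    $ping = get-wmiobject -query "SELECT StatusCode FROM Win32_PingStatus WHERE Address='$name'"
    if ($ping.StatusCode -eq 0) {
        # gather a few representative classes; the real script collects far more
        $inventory = @{
            'System' = get-wmiobject Win32_ComputerSystem -computername $name
            'OS'     = get-wmiobject Win32_OperatingSystem -computername $name
            'Disks'  = get-wmiobject Win32_LogicalDisk -computername $name
        }
        $inventory | export-clixml "c:\Inventory\$name.xml"
    }
}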

This script is in (very) rough draft form. It works, it does the job fairly well, and it doesn't have any major bugs. However, the code isn't clean, some attributes are missing (like CD-ROM information), and some things could be done better (capturing a list of computers that were missed so they can be picked up in the next scan, etc.). I am also starting to develop a wish list of other ideas for this script (optional scanning of categories, optionally putting all of the computers in one XML file instead of one per computer, etc.).

I am posting this script now so you can see it and dream with me. I will post the final version sometime next month, hopefully. Here it is.

Monday, March 10, 2008

Winter Scripting Games

I tried something new this year: I entered the Winter Scripting Games. I began by entering the VBScript contests (beginner and advanced) since I have had quite a bit of experience with VBScript. Within the first couple of days, I had completed most of the contests. I decided to have some real fun with the competition. I didn't know PowerShell or Perl when the contest started, so I challenged myself to learn PowerShell and complete both the beginner and advanced contests for that scripting language before the deadlines. It was close a couple of times, but I did it. I got a perfect score in all four categories (and the promise of a Dr. Scripto bobblehead to come).

The competition was a blast and I highly recommend it to anyone who likes to script or wants to learn. Check out the site (and the nice list of prizes). The Scripting Guys do a great job every year.

Friday, February 15, 2008

What this is all about

I guess I could call this "Confessions of a lazy blogger". My goal with this blog is to share what I am doing and what I have done with my computer systems. I have gotten quite a bit of help from the community over the years and I would like to do my part to give something back. I'll give you a little backstory about me.

For six years, I was a networking and programming consultant. I spent all of my time designing networks, automated systems, and programs to help companies with their initiatives. In 2004 I decided it was time for a change. I took a job at the small college where I am now, first as a programmer and network administrator and soon after as the Director of IT. I have been able to slow down the pace and look at a larger picture of our organization. Because we are a small institution, budget is always an issue (OK, when isn't it an issue?). We have some of the same needs as larger organizations, but less training and fewer resources to get things done. I have found that I have a real passion for providing smart solutions for organizations of our size.

Over the past few years, I have developed scripts and other methods for better utilizing the products we already have. This blog is a place to share those ideas. Maybe you can benefit from what I have put together or maybe you could add to it.