Customize Disk Partitions in MDT

For most systems, I typically recommend using the primary disk’s full capacity as one partition, C:\, instead of creating multiple partitions/drive letters for end users. As an IT Pro, I find it easier to locate someone’s “stuff” when they store their data in a standard location like the default profile location, C:\Users\%username%\. If all of your documents, pictures, shortcuts, Favorites, settings, etc. live in the same place, I don’t have to go hunting for files when it’s time to migrate someone to a new machine. (Or, better yet, I can automate it!) For the end user, it’s just easier: most people are used to saving files to the default locations on their home computers. Any time you can keep the corporate computing experience similar to what people experience at home, it saves you time and money.

However, there are times when it can be advantageous to create more than one partition when deploying an operating system (OS) to a computer. I know quite a few people who actually prefer that their end users store their data on D:\ so that it can be fully separated from the OS and applications on C:\. In the event of an unrecoverable OS crash or malware infection, C:\ can be wiped out and all of the user’s data on D:\ is still there. Personally, I’m not a huge fan of that approach because it tends to miss application settings, the user’s Registry hive, and other important things a user may miss later. But, to each his own I guess.

I am, however, a fan of separating data from the OS and software on servers. I’m also a fan of keeping my virtual machines totally separate from C:\. (Those things have a bad habit of filling up disks, don’t they!?!)

How MDT Partitions Disks

The disk partitioning process is a task that is part of each OS deployment Task Sequence.  By default, MDT creates a C:\ partition using the full first disk and names it OSDisk.  If this default doesn’t work for your environment, it is pretty easy to change.

Change the Default Partition

In the MDT Deployment Workbench, go to Deployment Shares > $YourDeploymentShare > Task Sequences.  Find the Task Sequence you want to edit and right-click on it.  Click on Properties.


In the Task Sequence Properties, go to Preinstall > New Computer only > Format and Partition Disk.

In the Volume section, you should see “OSDisk (Primary).” Click on OSDisk (Primary) and then click the Edit button. (The Edit button is the middle button that looks like a hand pointed at a document with a bulleted list.)

In the Partition Properties, you can change the Partition name, the size, file system, etc.

For our example, we’ll change the partition size to “Use specific size” and set it to 80 GB. Once we’re done, click OK.

We don’t want to waste the remaining disk space, so we’ll add a second partition that uses it. Back in the “Format and Partition Disk” task, click the New button. (The New button is the left-most button that looks like a yellow star.)

In the Partition Properties, fill in the Partition name with “Data Disk,” and select “Use a percentage of remaining free space.” Set the Size (%) to 100. Ensure the File system is set to NTFS and click OK.

When you’re done, you should have something that looks like this:

If we perform a test deployment, we should get an 80 GB C:\ partition and a second partition with the remaining space.
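Under the hood, MDT turns the “Format and Partition Disk” step into a diskpart script (via its ZTIDiskpart wizard script). Purely as an illustration (this is not the exact script MDT generates; the 80 GB size and drive letters come from our example), the finished layout is roughly equivalent to:

```
rem Illustrative diskpart script approximating the layout above
select disk 0
clean
rem 80 GB OSDisk partition (80 x 1024 MB)
create partition primary size=81920
format quick fs=ntfs label="OSDisk"
assign letter=C
rem Second partition using all remaining space
create partition primary
format quick fs=ntfs label="Data Disk"
assign letter=D
```

Seeing the equivalent script can also help when troubleshooting a failed partitioning step in the BDD.log/SMSTS.log output.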


What kind of reference image should I use and what should be in it?

I had a great question come in last week and the writer agreed to let me respond as an article:


Last July, I started my first real systems administrator job at a school system here in the Midwest. One of the things I inherited was Ghost for imaging computers in classrooms, computer labs and so on. Now that Symantec is killing off Ghost, I’ve been tasked with figuring out how we’re going to re-image computers this summer. We’ve settled on using SCCM for our OS deployments, but I had a question about reference images after reading your series on creating base images in MDT. What do you typically include in your reference images? Our Ghost images include literally everything from Office to Java to other random education apps… just about all of them.  We even found an image with some old gradebook software in it. The gradebook software went fully web-based years ago (before I even got here) and the software just never got taken out! The problem is that it feels like we’re constantly updating the reference image (all 40-something of them!!!!), people have apps they don’t need, many of the apps like Java and Flash have to be updated immediately after a re-image, there are remnants of old software, and so on.

Any help or advice you can provide would be really helpful!

Jeremy S.


First off, thanks for letting me answer your question in the form of a blog post!  And, thank you for responding to my followup questions so crazy fast.  Here we go:

I too came from the school of Ghost imaging, so I totally understand where you’re coming from. A lot of people that use sector-based imaging solutions build massive, monolithic, catch-all images and tend to update them for years on end before re-creating them from scratch (or they just keep using the same base forever!). And for good reason: you tended to need a whole lot of them to cover all of your hardware types and use cases. The good news is that when Vista came out, the whole OS deployment process got an overhaul, making OS deployment far more customizable and predictable without the need for these massive reference images (unless your particular environment requires them).

MDT and SCCM have really changed the game for OS deployments. You don’t need to create a monolithic reference image that includes every single piece of software someone needs if you don’t want to. You can install as much or as little as you want and then use MDT or SCCM to customize that deployment at install time. So before we can really get into a discussion about the what of your reference image, you’ll need to decide what kind of reference image you’re going to create first.

There are three schools of thought when it comes to creating reference images: Thin Images, Thick Images, and Hybrid Images that fall somewhere between the two.

[Short Answer] Which do I recommend? Honestly, it depends on your environment and what you’re trying to accomplish. If you just need to test something like a script, where you don’t need any applications or a fully patched OS, a Thin Image is probably all you need. If you’re imaging a computer lab full of computers that all need to be identical, then you probably need a Thick Image. Most people I know (including me) are using a Hybrid Image. I use a Hybrid Image because the applications used by my end users vary and I like to be able to customize the deployment to their specific needs.

[Long Answer] —

Thin Images

For me, a Thin Image is OS only. I’ve seen some people use just the RTM media to deploy Windows 7/8 and then lay down all their software, but there’s one huge problem with doing it that way: if you use the RTM bits, you now have to install all of the Windows Updates too. Ouch. That can be really time consuming. Personally, I like to keep a Thin Image available: a Windows reference image with our currently supported version of Internet Explorer and the latest Windows Updates installed, but no other 3rd-party software. Even if I don’t update it every single month, I’m not having to wait while over a year’s worth of updates are installed on the computer. There’s also the added benefit of speeding up the process of building a Thick/Hybrid Image if I base it off my fully patched Windows 7/8 Thin Image.
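If you want to keep that Thin Image current without rebuilding it from media every time, one option is offline servicing: mounting the captured WIM and injecting downloaded update packages with DISM. This is only a rough sketch; the file paths, image index, and update folder are assumptions for illustration:

```
rem Mount the captured reference image (index 1 assumed)
Dism /Mount-Wim /WimFile:C:\Images\Reference.wim /Index:1 /MountDir:C:\Mount

rem Inject downloaded update packages (.cab/.msu) from a folder
Dism /Image:C:\Mount /Add-Package /PackagePath:C:\Updates

rem Commit the changes and unmount
Dism /Unmount-Wim /MountDir:C:\Mount /Commit
```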


Pros:

  • Smaller image since you’re just dealing with the base OS (and possibly Windows Updates).
  • Very customizable since there isn’t any software installed.
  • Speedy install of a base OS (assuming you’re including Windows Updates).


Cons:

  • Requires installing months’ (if not years’) worth of Windows Updates if you don’t keep the reference image current.
  • The full deployment process of laying down the OS and installing all your software on a computer may be slower since you’ll potentially have to install Office, Adobe products, plugins, etc.
  • Potentially eats up additional CPU cycles and disk IOPS in a virtualized environment while software installs.


When to use a Thin Image:

  • Any time you just need Windows on a system, whether for testing or for systems that don’t require additional software.
  • When you need to customize the install of each and every computer that will be deployed.


What I include in a Thin Image:

  • Windows Base OS
  • Latest version of IE your applications support
  • Latest Windows Updates
  • [Consider] Visual C++ Runtimes

Thick Images

A Thick Image is everything and the kitchen sink (ok, well maybe not the kitchen sink…):  Windows, Office, all the latest Windows/Office Updates, plugins, custom apps, and everything else you can think to install.


Pros:

  • PC is ready faster since all necessary software is installed as part of the image.
  • Works well as a “cookie cutter” deployment to large numbers of identical systems like computer labs or corporate environments where every PC should be identical.
  • Easier to hand to junior-level staff or temps since everything is already installed.
  • Less chance for a piece of software to be missed at deploy time since everything intended for the system is already in the image.


Cons:

  • May require more frequent rebuilds since you’ll need to update it monthly for Patch Tuesday updates from Microsoft and third-party products.
  • May require patching after the image is deployed since third-party products like Adobe Reader, Adobe Flash, Oracle Java, etc. may have been updated since the image was built.
  • May require building multiple reference images since software needs may differ between departments, computer labs, etc.
  • An error like a misconfiguration or a missing piece of software in a Thick Image goes out to more computers.
  • Users end up with software they potentially don’t need. Unneeded software still needs to be patched/updated even if the user doesn’t use it.


When to use a Thick Image:

  • Computer labs where a room full of systems will all be identical.
  • Server deployments where all the systems will be identical.
  • Large-scale deployments where all the systems will be identical (see a trend here?).
  • Time-sensitive deployments when you need to deploy the OS and all software to a system as quickly as possible.


What I include in a Thick Image:

  • Windows Base OS & EVERYTHING else
  • Latest version of IE your applications support
  • Latest Windows Updates
  • Visual C++ Runtimes
  • Office (and latest updates)
  • Browser Plugins (Flash, Java, etc.)
  • Adobe Reader/Acrobat
  • Antivirus software
  • Management agents
  • VPN Client

Hybrid Images

A Hybrid Image is somewhere between a Thin and a Thick Image. It would typically include applications that everyone gets and that [hopefully] aren’t updated constantly, like Office, Visual C++ runtimes, various agents, and OS customizations like adding wallpapers.


Pros:

  • Smaller images than Thick Images since unnecessary software isn’t installed.
  • More customizable since unnecessary applications aren’t installed and the image can be tailored to the needs of the system’s user at deploy time.
  • Faster deployment since larger common packages like Office and Windows/Office Updates are already installed.


Cons:

  • Still may require updates after deployment if the image isn’t updated regularly.
  • Slightly slower deployment if large packages are left out of the image and need to be installed as part of the OS deployment process.


When to use a Hybrid Image:

  • You have applications that all users get (like Office), but you still want the ability to customize the experience for each department or user.
  • You don’t want to constantly update images just to keep things like Java and Flash current.


What I include in a Hybrid Image:

  • Windows Base OS
  • Latest version of IE your applications support
  • Latest Windows Updates
  • Office (and latest updates)
  • Visual C++ Runtimes
  • Management agents
  • Antivirus Software
  • Install everything else at OS deployment time

Customize IT Organization Using Variables in MDT

As I covered previously, you can customize the CustomSettings.ini in the Microsoft Deployment Toolkit (MDT) to show a custom message like the name of your IT department or company/organization when an OS deployment is running.  You can take this customization a step further by using variables within the MDT environment to customize the message further.
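For example (the organization name here is a placeholder, and %SerialNumber% is one of MDT’s built-in properties), a CustomSettings.ini fragment along these lines sets the deployment banner per machine:

```
[Settings]
Priority=Default

[Default]
_SMSTSOrgName=Contoso IT - Imaging serial %SerialNumber%
```

Any property MDT has gathered by the time the Task Sequence runs can be substituted into the string this way.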


Create a [Mostly] Automated Reference Image in MDT – Part 5: Pause/Suspend the Task Sequence

There may be times when you, for one reason or another, have to perform a manual step as part of creating a reference image. This could be anything from installing a finicky or old piece of software that doesn’t have an unattended installer to making manual changes or anything else that for whatever reason can’t be automated. When this happens, you need to temporarily pause or suspend the Task Sequence so that you can perform whatever manual steps are needed. So, how do you do that?
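In MDT Lite Touch, the usual approach is a Run Command Line step that calls the LTISuspend.wsf script that ships with MDT; the step’s command line is simply:

```
cscript.exe "%SCRIPTROOT%\LTISuspend.wsf"
```

When that step runs, the Task Sequence pauses and leaves a “Resume Task Sequence” shortcut behind; you perform your manual steps, then use the shortcut to pick up where the sequence left off.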
