Tuesday 25 December 2012

Sci-Fi Alien Weapons

As usual with modelling, it has taken me about three weekends to create one model.

I drew the concept on the 10th of December, put the final touches to the weapon and added it to the game on Christmas Eve.



It's inspired by, in fact deliberately similar to, the alien-style weapons pack I purchased from Garage Games.



That set only includes three models, and I needed an additional, larger support weapon for the aliens.


I am getting the hang of 3D modelling and the basic untextured model only took a day.  What takes all the time is the UV unwrapping and then the texturing.

I always start by creating the Ambient Occlusion layer for my texture.



It does not help that I have to do some bits twice because I forget to set some parts of the model as smooth before creating the Ambient Occlusion (AO).


Without using the smooth option on parts of the model that are curved, the AO shading is banded instead of continuous.



I always like the look of the model with just the AO texture.  Shame I can't leave it that way.  I have seen a game that used that as its art style.  It looked very good, but it's not what I want for Diabolical: The Shooter.

If my artistic skills were better I would be able to create the texture quicker.  I have to put most of it together from parts.  More like construction than drawing.  I go through each part, as split by the UV unwrapping, and add details as separate components blended into the base colour.  Layer after layer after layer.



I am starting to learn some of the techniques but it takes me time.

For some reason it ended up grey, but for this model I wanted a more silver colour to match the others.  I added a bit of brightness, but mainly I adjusted the gamma correction curve to get something more silver.
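I did the curve adjustment in my image editor, but for anyone wondering what a gamma curve actually does to the pixels, something like the following is the equivalent in XNA code.  This is only an illustrative sketch; ApplyGamma and its gamma parameter are made up for the example and are not part of my tools:

public static void ApplyGamma(Texture2D texture, float gamma)
{
    // Read the pixels, apply the curve to each channel, write them back.
    Color[] pixels = new Color[texture.Width * texture.Height];
    texture.GetData(pixels);
    for (int i = 0; i < pixels.Length; i++)
    {
        Vector3 c = pixels[i].ToVector3();
        // Raise each channel (0..1) to the power 1/gamma.
        // Values above 1 lift the mid tones without clipping the highlights.
        c.X = (float)Math.Pow(c.X, 1.0 / gamma);
        c.Y = (float)Math.Pow(c.Y, 1.0 / gamma);
        c.Z = (float)Math.Pow(c.Z, 1.0 / gamma);
        pixels[i] = new Color(c.X, c.Y, c.Z, pixels[i].A / 255f);
    }
    texture.SetData(pixels);
}

A gamma value a little above one lifts the mid tones towards white without blowing out the highlights, which is roughly the silvery look I was after.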



The in game lighting adds a bit more shine to the surfaces.



Merry Christmas to all.  25 Dec. 2012.

Thursday 20 December 2012

Winform Map Editor

Last night I finished my side project of migrating the map editor from being a game screen to being a normal Windows application (WinForms).  I've been doing this whenever I needed a break from 3D modelling. 


At the moment all I have attempted to do is move all the features I had before into a separate project.  I did fix a few minor bugs along the way, but it is basically the same.

The only major change was to the controls for moving about the map.  It now works more like a 3D modelling programme; in fact the rotate, zoom and move controls are very similar to Blender's.  I have, however, retained first-person keyboard controls for fine adjustments.
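Roughly speaking, the mouse handling boils down to something like the sketch below.  This is not the actual editor code, just the idea; lastMouse, yaw, pitch, distance, target, cameraRight, cameraUp and the speed constants are all stand-in names for the example:

// Sketch of Blender style mouse controls in the WinForms view.
private void GameView_MouseMove(object sender, MouseEventArgs e)
{
    int dx = e.X - lastMouse.X;
    int dy = e.Y - lastMouse.Y;
    lastMouse = e.Location;

    if (e.Button == MouseButtons.Middle &&
        (Control.ModifierKeys & Keys.Shift) == Keys.Shift)
    {
        // Shift + middle drag pans the point the camera looks at.
        target += cameraRight * (-dx * panSpeed) + cameraUp * (dy * panSpeed);
    }
    else if (e.Button == MouseButtons.Middle)
    {
        // Middle drag orbits round the target, as in Blender.
        yaw -= dx * rotateSpeed;
        pitch -= dy * rotateSpeed;
    }
}

private void GameView_MouseWheel(object sender, MouseEventArgs e)
{
    // The wheel zooms towards or away from the target.
    distance *= (float)Math.Pow(0.9, e.Delta / 120.0);
}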

I am doing most of the modelling in Blender so the new controls are much more familiar when I bring the models into the editor.

I have left space to the side and bottom of the main view window to put properties.  I have not coded anything to go there yet but in time I will add something useful.

The editor is still linked to the main game.  It shares some of the rendering code and all of the map loading code.  I compile the editor project in the same solution as the main game.  This means that the map files will always remain compatible.

I find it works well.  The only peculiarity I found was that the game view control from the Microsoft sample runs separately from the rest of the form.  Exceptions thrown in the game view crash the control, but the rest of the WinForms app continues without error.  The view window just goes white with a red cross in it!

I fixed that with a simple try/catch wrapped round my update and draw code:


/// <summary>
/// Redraws the control in response 
/// to a WinForms paint message.
/// </summary>
protected override void OnPaint(PaintEventArgs e)
{
    try
    {
        string beginDrawError = BeginDraw();

        if (string.IsNullOrEmpty(beginDrawError))
        {
            // Slow the game down to no more 
            // than 60 fps (16.667ms per frame)
            if (gameTime.ElapsedUpdateTime > 
                TimeSpan.FromMilliseconds(16))
            {
                // My own version of GameTime
                // based on a stopwatch.
                gameTime.Update();
                // Simulate the update loop in 
                // an XNA game.
                Update(gameTime);
                // Draw the control using the 
                // GraphicsDevice.
                Draw(gameTime);
                EndDraw();
            }
        }
        else
        {
            // If BeginDraw failed, show an error 
            // message using System.Drawing.
            PaintUsingSystemDrawing(e.Graphics, 
                                    beginDrawError);
        }
    }
    catch (Exception ex)
    {
        // Exceptions within external applications do 
        // not automatically break the parent form.
        // These lines trap the exception, display the 
        // error and break the parent code for debugging.
        System.Diagnostics.Debug.WriteLine(
            "Exception in Update or Draw: " + 
            ex.Message + " in " + ex.Source);
        System.Diagnostics.Debug.WriteLine(
            "Stack trace: " + ex.StackTrace);
        throw;
    }
}

The whole app now crashes and the exception is displayed to help debugging.

In addition to being able to position models, triggers and waypoints and generate the navigation mesh used by bots, I have loads of helper overlays within the editor.  Too many to list:


I nearly forgot.  There was a point to moving the editor out of the main game. 

Every time I tried to change the game I had to keep it compatible with the editor.  There was often a lot of extra code just so the game remained fast and the editor kept its features.

When I started looking at adding in the networking code it just got too complicated to do what I wanted and retain compatibility.  That's what kicked off this little side project.

It's how I should have done it in the first place.

Sunday 9 December 2012

One Line Of Code

I am sure that everyone has these days when coding.  This is just a quick note to remind people to keep on going no matter how bleak things appear!

I had been working away on the new weapons and effects and got round to testing everything on the Xbox.  I do this from time to time because the Xbox does not perform in the same way as the PC does.  Much to my surprise the performance was a disaster on the Xbox!

That was late yesterday and I was going out in the evening, so I left it, wondering how long it would take to find whatever I had done to cause a frame rate of just 9 frames per second.  It was perfect on the PC, but the Xbox was unplayable!

I had visions of the level being too complicated and having to come up with some clever code to cull more efficiently, or of the texture files being too large and having to reduce their quality.  I had quickly ruled out the particle effects because, although they were pretty, the code, quantity and texture sizes were unchanged from effects I had been using for years, and the performance graphs did not change when they were on screen.



I've been working in I.T. long enough to know that whatever you last changed is the most likely cause for whatever problem you are now investigating.  This morning I therefore thought through all the changes since the last time I had successfully tested on the Xbox...

I thought it had been a fairly easy change with no effect on performance, but I had completely re-written the Audio Manager to move from using XACT to using the SoundEffect classes.

Some quick commenting out and I very rapidly confirmed that somewhere in my conversion I had made a mistake.  To cut an hour's searching short, I found it.

Just one very short line of code:
nextSong = "";

I had an endless loop that would start playing the next waiting piece of background music every frame because I had not cleared the variable once it started playing!
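For my own future reference, the corrected logic is roughly the following.  This is a simplified sketch rather than the real Audio Manager; nextSong, currentSong and the songs dictionary are stand-in names for the example:

// Called once per frame.  nextSong holds the name of the music queued
// to play; currentSong is the instance currently playing.
public void Update()
{
    if (!string.IsNullOrEmpty(nextSong))
    {
        if (currentSong != null)
        {
            currentSong.Stop();
            currentSong.Dispose();
        }
        currentSong = songs[nextSong].CreateInstance();
        currentSong.IsLooped = true;
        currentSong.Play();
        // The one line I had missed: clear the request so the same
        // song is not restarted every frame.
        nextSong = "";
    }
}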

I am a lot happier now and I can go back to creating assets and effects for the game.

Sunday 2 December 2012

New Weapons New Effects

Having completed modelling the new alien sci-fi weapons and got them into the game, I wanted to finish them off with their own sounds and visual projectile effects.




I am pleased with the results.  The picture above is just one of them.  I have three weapons ready and two particle effects.  Two of the weapons share the same ammunition, and therefore the same effect.

The effects are created with particles and I use a tool I wrote for myself to help adjust the effect until I am happy with it.  I created the tool as open source so anyone can use it: https://gpuparticles.codeplex.com/

I find it takes a while to edit the 2D images used for the particles.  It's not that the particle textures need to be complicated; it's just that it is difficult to visualise what they will look like in the effect until you try them.  There's a lot of back and forth getting things as I want them.

That's the texture from the other effect I have created.

Now for the sound effects.  I was able to find some public domain and Creative Commons licensed sounds that are just right for the sci-fi-sounding shots.  At the moment I have used them 'as is', but they need some minor adjustments.  The main change, and my next task, is to normalise the volume levels so they all sound balanced relative to each other.
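A quick way to balance them without re-editing every file, while I get round to proper normalisation, is to apply a relative scale when each sound is played.  The sketch below assumes a hand-tuned dictionary of scales and the standard SoundEffect.Play overload; volumeScales, soundEffects and PlayShot are invented names for the example:

// Play a weapon sound at a hand tuned relative volume.
// volumeScales and soundEffects are assumed to be filled in elsewhere.
public void PlayShot(string name)
{
    float scale;
    if (!volumeScales.TryGetValue(name, out scale))
    {
        scale = 1.0f;
    }
    // SoundEffect.Play(volume, pitch, pan): volume is 0.0 to 1.0.
    soundEffects[name].Play(scale, 0.0f, 0.0f);
}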

That's a job for next weekend I think.

Saturday 24 November 2012

3D Modelling Takes Time

I haven't posted for over a month.  I didn't want you all to think I had given up.

I have been working away converting models for use in the game.  The overall layout of the first level is complete.  It needs some aesthetic improvements, mainly to the ground textures, but it is getting there.

I got to a stage where I needed to add some weapon pickups, so I needed the weapon models.  I may have mentioned previously that finding pre-made models in a sci-fi style is quite difficult.  I found these weapons, made for the Torque 3D engine by Garage Games.



The style is ideal for Diabolical but all the weapons have to be at the correct scale and orientation and the grips have to be in the right places to line up with the animations.  There are also some bits used by the Torque 3D engine that I do not need for my game engine.



It may not sound like a long task, but making those few changes takes me several hours per weapon, so with only a few hours available each weekend recently it has taken me a few weeks.  The limited time has mainly been due to playing through Halo 4 :-)

Nearly there with these weapons; then I need to add the particle effects and get them in the game.

Saturday 13 October 2012

MipMaps and Textures

I've been tidying up and improving my code, mainly to get my level map files into a single file and to use the content pipeline to load the maps into the editor rather than having two sets of loading code.

In the process I have worked out how to create a texture in the content pipeline.

As I am now loading everything in the Content Pipeline I will no longer be using the code that creates MipMaps at run time.  Rather than forget how I did this, I thought I'd add the code here to remind me.

I won't go into detail, but a mipmap is simply a half-size version of the main image stored in the same file.  Typically several levels of ever-decreasing images are stored together.  They are used so that textures displayed at a distance look less pixelated.



If the above image did not use MipMaps then the distance would look grainy.
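Each level is half the width and height of the one above it, all the way down to 1x1, so the number of levels only grows with the log of the texture size.  A quick way to work out how many levels a square texture carries (just an illustrative helper, not part of the engine):

// Count the mip levels for a square texture, including the full size image.
// For example a 1024 texture has 11 levels: 1024, 512, 256, ... 2, 1.
public static int MipLevelCount(int size)
{
    int levels = 1;
    while (size > 1)
    {
        size /= 2;
        levels++;
    }
    return levels;
}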

MipMaps At Run Time

This bit of code generates MipMaps at run time:


/// Create mipmaps for a texture
/// See: http://xboxforums.create.msdn.com/forums/p/60738/377862.aspx
/// Make sure this is only called from the main thread or 
/// when drawing is not already happening.
public static void GenerateMipMaps(
  GraphicsDevice graphicsDevice, 
  SpriteBatch spriteBatch, 
  ref Texture2D inOutImage)
{
  RenderTarget2D target = 
    new RenderTarget2D(
      graphicsDevice, 
      inOutImage.Width, 
      inOutImage.Height, 
      true, 
      SurfaceFormat.Color, 
      DepthFormat.None);
  graphicsDevice.SetRenderTarget(target);
  graphicsDevice.Clear(Color.Black);
  spriteBatch.Begin();
  spriteBatch.Draw(inOutImage, Vector2.Zero, Color.White);
  spriteBatch.End();
  graphicsDevice.SetRenderTarget(null);
  inOutImage.Dispose();
  inOutImage = (Texture2D)target;
}


Textures In The Pipeline

The second bit of code creates images from a byte array in a content processor using the XNA Content Pipeline:


/// Create the texture image from the 
/// Color byte array, Red, Green, Blue, Alpha.
private Texture2DContent BuildTexture(
  byte[] bytes, 
  int sizeWide, 
  string uniqueAssetName)
{
    PixelBitmapContent<Color> bitmap = 
      new PixelBitmapContent<Color>(sizeWide, sizeWide);
    bitmap.SetPixelData(bytes);

    Texture2DContent texture = new Texture2DContent();
    texture.Name = uniqueAssetName;
    texture.Identity = new ContentIdentity();
    texture.Mipmaps.Add(bitmap);
    texture.GenerateMipmaps(true);
    return texture;
}

When I was looking, similar samples were hard to find.  I hope these bits of code may be useful to others.

MipMaps In The Pipeline

The following is how I now create the mipmaps in the content pipeline while loading an image:



/// <summary>
/// Load and convert an image to the desired format.
/// Returns the image suitable to use as the texture on the Landscape.
/// </summary>
private Texture2DContent BuildLayerImage(
    ContentProcessorContext context, 
    string filepath)
{
    ExternalReference<TextureContent> layerRef = 
        new ExternalReference<TextureContent>(filepath);
    
    // Mipmaps required to avoid speckling effects and to avoid moiré patterns.
    // Premultiplied alpha required so that the texture layers fade together. 
    OpaqueDataDictionary processorParams = new OpaqueDataDictionary();
    processorParams["ColorKeyColor"] = new Color(255, 0, 255, 255);
    processorParams["ColorKeyEnabled"] = false;
    processorParams["TextureFormat"] = TextureProcessorOutputFormat.DxtCompressed;
    processorParams["GenerateMipmaps"] = true;
    processorParams["ResizeToPowerOfTwo"] = false;
    processorParams["PremultiplyAlpha"] = true;

    // Texture2DContent does not have a default processor.  
    // Use TextureContent and cast to Texture2DContent
    // A processor is required to use the parameters specified.
    return (Texture2DContent)context.BuildAndLoadAsset<TextureContent, TextureContent>(
            layerRef, 
            typeof(TextureProcessor).Name, 
            processorParams, 
            null);
}



That code can be called from within the processor using something similar to:


string filepath = Path.Combine(directory, filename);
Texture2DContent LayerImage = BuildLayerImage(context, filepath);


The above was worked out from the following forum and blog posts:
http://xboxforums.create.msdn.com/forums/p/44185/263136.aspx
http://blogs.msdn.com/b/shawnhar/archive/2009/09/14/texture-filtering-mipmaps.aspx

It also helps to know what the parameters are.  You can look them up from the xml file generated by the content pipeline:
Content/obj/x86/Debug/ContentPipeline.xml

The following is extracted from that file:


<Importer>TextureImporter</Importer>
<Processor>TextureProcessor</Processor>
<Parameters>
    <Data Key="ColorKeyColor" Type="Framework:Color">FFFF00FF</Data>
    <Data Key="ColorKeyEnabled" Type="bool">true</Data>
    <Data Key="TextureFormat" Type="Processors:TextureProcessorOutputFormat">DxtCompressed</Data>
    <Data Key="GenerateMipmaps" Type="bool">true</Data>
    <Data Key="ResizeToPowerOfTwo" Type="bool">false</Data>
    <Data Key="PremultiplyAlpha" Type="bool">true</Data>
</Parameters>

Already a Texture in the Pipeline

If you want to chain from one processor to another or do more than one thing to a texture you can use the ContentProcessorContext.Convert method to run another processor on the same object.  In the following example the input is an existing texture loaded elsewhere in the pipeline:


TextureContent output = 
    context.Convert<TextureContent, TextureContent>(
        input, 
        typeof(TextureProcessor).Name, 
        processorParams);


If you want to just add the mipmaps to an existing TextureContent image, as shown in the bitmap example earlier in this post, you can call the following method:

input.GenerateMipmaps( true );

The pipeline code is a bit harder to understand but is probably more useful.

===

Here's a discussion thread I found recently about mipmap coding in general:
http://xboxforums.create.msdn.com/forums/t/111550.aspx
And that came from here:
http://xboxforums.create.msdn.com/forums/p/111606/667162.aspx#667162

JCB 23 Sept 2013.

Sunday 16 September 2012

Bake A Pose In Blender

Following on from my post the other day about importing into Blender, I need to change the bind pose to the T form so that all my animations start from the same pose.


Repositioning the model can be done by posing the model using the armature.  The trouble is that the moment you try to edit the mesh it always reverts to the original rest pose position. 



I want to set the vertices in the posed position and for them to stay there.

I am sure I used to be able to move the model in pose mode and then apply that pose as the rest pose.  I tried that and tried searching Google but nothing worked.

I eventually found what I needed and it is very simple. 

Before doing anything, save a separate copy of the file.  You may want to go back and change your mind about the pose, because the following removes the armature from the mesh.

Fixing the Pose

Pose the model in whatever form is needed, swap back to Object mode, select the model (not the armature) and then simply press the button to apply the Armature modifier.



Job done: the object is now separated from the armature and stays in the last pose it was in when you edit it.


It is not always ideal to remove the armature but at least I can re-add an armature if necessary.

It's taken me most of the afternoon to work out how to do this but now I have run out of time.  Sunday dinner is ready and I am unlikely to have time to edit the pose in to the T-pose until next weekend!


Thursday 13 September 2012

Importing In To Blender

I have purchased a few models to help make the levels a bit more interesting.  It is supposed to save me time so I don't have to create them all myself, but it has taken me a while to find a good method to get the models into Blender.

The sets of models come with loads of different file types for each model, the most useful being Autodesk FBX, Collada DAE, Wavefront OBJ and Lightwave LWO.  The trouble is that none of them are fully compatible with the Blender importers! 

 

 

Some of them just cause errors and will not import, some import without their textures, but by far the biggest problem is that the models import as one object with multiple textures.  The latter looks OK, but the moment you edit the mesh the textures get combined onto just one of the supplied textures, spoiling the look of the model!

I have a solution for most of the issues with static models but I have not got anything reliable for the armature and animations in animated models.

Getting the model in to Blender

I found the best solution for static, non-animated, models was to use the Autodesk FBX converter (v2013.2).  Take the FBX file as the source and output to an OBJ file.  The result is compatible with the OBJ importer in Blender (2.63a).



It's very easy to use: just open the source FBX, shown on the left, select OBJ on the right-hand side and press Convert. 

Animated Models

As mentioned above I do not have a reliable solution for these.  I would welcome hearing from anyone who does.  Luckily the animated models I have purchased, so far, have always had a version of Collada compatible with Blender supplied with them.  I can fix the other problems with the imported files using the techniques that follow.

I have tested converting FBX files to Collada using the Autodesk FBX Converter mentioned above and the results have failed to open in Blender!

As far as I know OBJ files do not contain any animation information.  You can convert the mesh to OBJ but the armature and the animations are lost.

I cannot find a way to scale animations and keep them working!

Textures

Blender can find the textures more reliably if they are all in the same folder as the blend file.  The first job I do at the start is to find all the images from the originating folders and copy them into whichever folder I have created the blend file in.


Some source files contain the paths used by the author on their machine.  That is not very helpful.  You can use a text editor to fix this but I find that converting the FBX file as described above usually fixes that problem at the same time.

Importing

No need for instructions for this bit.  It's just: select Import from the File menu, pick the import type, Wavefront (.obj) or Collada (.dae), and select the file.



A word of warning.  Animated models can take a very long time to load.  If Blender appears to hang, give it a chance.  For one model I had to wait nearly 5 minutes, but it did load eventually.

Rotation

If you've used the importer on the model you should have it in Blender looking correct, but it might still have some problems.  The most common is that lots of other 3D modelling editors use the Y axis for up, but Blender uses Z for up.

When imported into Blender, an Object rotation might have been added so the model looks the correct way up.  Check what the rotation is in the Transform window.


Alternatively, the model appears lying on its back but the Object rotation is set to zero.


My preference is to end up with no rotation in Object mode and apply any rotation needed in Edit mode.   That is required if you want to use the model in XNA.

Blender has a simple function to sort this out but, as is common with Blender, it is lost among all the other features.  See the Apply menu, which pops up when you press Ctrl+A.

First get the model looking the right way up in Object mode.  If the model is lying on its back, usually all that is needed is to manually type 90 into the X axis field of the rotation transformation.


Once the model appears the correct way up, if it has a rotation you can apply that transformation to remove the Object mode settings and bake it into the vertices and armature (Ctrl+A).

In Object mode select the model that needs rotating.  If the model has an armature, also select that at the same time.  I usually select the mesh first, then the armature.  I don't know if it makes any difference.



Make sure the cursor (or whatever point is used for rotating and scaling round) is at exactly the same point as the origin of your model. 



Move the model and the cursor to zero first or probably better still use the pivot centre that rotates and scales round the model.


With the rotation point in the correct position, from the Object menu, Apply, Rotation.

The model will look the same but the rotation in the transform settings will be reset to zero.


If the model is animated, now is a good time to check that the actions still look correct. 


Scale and Location

I would also recommend removing any scale and location before you start using the model, just so you have a clean slate when you start.  It is almost essential to do this if you want to use the model with XNA.



In Object mode you can just edit the fields to a scale of one and a location of zero.  There are methods similar to that described for rotation to apply location and scale. 

If you want the model to stay still and just the origin to move, apply the location (Ctrl+A, L).  I find that useful for laying out levels.  If you want the whole model to move to zero then manually edit the location.

I have had mixed results with scale, mainly because the results can be dramatically affected by the rotation point.  Make sure that you know where whatever rotation point you are using is.  That rotation point affects what the model will look like after applying the scale.  If all three scales are already the same you can just set them all to one and adjust the size of the model in Edit mode.




Separate The Textures

Although at this stage the model should look correct, if you were to try to manipulate it in Edit mode you would find that the UV mapping would get messed up.  For some reason unknown to me the model is imported as a single object with multiple textures.  I have no idea how to deal with a model like that in Blender.

I find the simplest solution is to split the model by its component textures before it gets muddled up.

It must be a common thing because Blender has a simple method to do this called Separate.



Select the model, go to EDIT mode and press P to separate.  From the drop down list on the popup menu select 'By Material'.  Job done.  Now you have one object per texture.  Much easier to keep the model looking as it should.


That's it.  A few steps to get there but the model is in a usable form within Blender.

Sunday 9 September 2012

Texture Names

I'm doing more and more of my level design within Blender.  The process is much quicker.  I can import nearly any model type and line it up as I want then export to a single FBX file for use in the game's pipeline.

I slightly changed my XNA code to sort models at the mesh level rather than the model level so I can load FBX files containing multiple models and multiple meshes.  So far it works well.
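In practice that just means treating each ModelMesh in the loaded file as its own item to sort and draw, rather than handling the whole Model as one unit.  The sketch below is the standard XNA pattern using BasicEffect rather than my own shaders; view, projection and worldTransform are assumed to come from elsewhere:

// Draw every mesh in the FBX individually, each with its own bone transform,
// so a file containing several models still renders in the right places.
Matrix[] boneTransforms = new Matrix[model.Bones.Count];
model.CopyAbsoluteBoneTransformsTo(boneTransforms);

foreach (ModelMesh mesh in model.Meshes)
{
    Matrix world = boneTransforms[mesh.ParentBone.Index] * worldTransform;
    foreach (BasicEffect effect in mesh.Effects)
    {
        effect.World = world;
        effect.View = view;
        effect.Projection = projection;
    }
    mesh.Draw();
}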

In the process of doing this I noticed that most models use a suffix on the texture name to indicate whether the image is the diffuse texture, a normal map, a specular map, etc.  I had been using a prefix.  I don't know if it will be useful in the future, but I decided to do the same as everyone else before I had gone too far.

When renaming the UV map in Blender you need to be aware that the name of the texture on disk and the internal name used by Blender are not necessarily the same.

 

When you replace the image you can see the old name still in use, which will not have been renamed.

To ensure that the name used in the FBX export and the name on disk and the name within Blender are all the same I do the following:

  • Open the blend file before changing anything.



  • Rename the files on disk.

  • Rename the UV maps in Blender.
You can just type over the old name with the new name and hit return.
The final crucial bit is to...
  • replace the file used in Blender with the renamed one.


I use the same file name as the name shown in Blender.


In my opinion having the same name throughout saves a lot of confusion when transferring the exported FBX file and the textures for use in the XNA pipeline.

Thursday 30 August 2012

Consequences

I made what I thought was a tiny change to the code to fix a problem, and now when I test on the Xbox the steady 60 frames per second (FPS) sometimes drops momentarily to 58 FPS.  This is not noticeable in game, but I know it is there.

It demonstrates how close to the limit my current code is.



The change was required to fix a problem where a small model, in some positions and viewed from some angles, would disappear from view.  It was either a bug in the frustum-to-sphere intersection testing code or I was just being overzealous with what I culled from the view.  Whatever the problem was, it was solved by increasing the area used for testing what was potentially in view.  The consequence is that more models are likely to be drawn each frame.
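The fix boiled down to something like the sketch below.  This is a simplified version of the visibility test, not my actual culling code, and cullPadding is just an illustrative name for the extra margin added to each bounding sphere:

// Simplified visibility test.  Growing the bounding sphere slightly stops
// small models popping out of view at the screen edges, at the cost of
// drawing a few more models each frame.
private bool IsPotentiallyVisible(BoundingFrustum frustum, BoundingSphere bounds)
{
    BoundingSphere padded = bounds;
    padded.Radius += cullPadding;
    return frustum.Intersects(padded);
}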

Those few extra models were enough, in some places on my first level, to tip the balance from drawing everything in 16.6 milliseconds (one frame) to not being able to draw everything in that time.

I'll tinker with the code to try to fix both problems.

Sunday 19 August 2012

Follow My Own Advice

I've just spent several hours trying to find out what was wrong with my navigation code in my game.  It had worked up until the last changes I made to the level.  This time when I tried to calculate the navigation mesh I got an invisible wall blocking my way!

I could not see anything in my code and none of my debug output showed what could be causing the problem.  Eventually I found it.  Not my code!  It was that one of the models I had exported from Blender had a rotation  left on it!


The rotation in the top right of the above screen shot from Blender is what had been causing me problems.  I will repeat for my own benefit...

Before exporting for use in XNA, make sure that no rotation or scale has been set and make sure the object is located at zero.

That's what it should be.

I don't know why, but the model appeared to look correct in XNA.  When I displayed where I had calculated all the triangles to be, they were at 90 degrees to where the model was rendered!

Rotating the model in Blender and exporting again fixed the problems.

I'm happy it was not my code but annoyed I didn't spot it while creating the model!

All models need to be created just like my animated export instructions:
http://blog.diabolicalgame.co.uk/2011/07/exporting-animated-models-from-blender.html

One more thing...



remember to add the Edge Split modifier to ensure square edges appear square when rendered.

Thursday 2 August 2012

Balancing Act

Over the last couple of weeks I've been designing and modelling a set of 3D structures to use as the alien interiors.  I wanted them to be consistent and sci-fi looking, but also slightly alien rather than looking like something a human would construct.  I'm getting there, but my inspiration is limited somewhat by my technical and artistic ability.





Being consistent, they can all share a very small number of textures.  This has the great advantage that, if they are created as one model, only one draw call to the graphics card is needed to display them.  The Xbox 360 can handle huge numbers of triangles but gets much slower if it needs to make loads of draw calls.
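As a rough rule of thumb in XNA, each ModelMeshPart becomes one draw call per pass of its effect, so counting the parts in a model gives a quick estimate of how expensive a combined layout is to render.  The helper below is only a sketch, assuming the standard Model classes:

// Rough estimate of the draw calls a model will cost per frame:
// approximately one per ModelMeshPart for each pass of the effect.
public static int CountDrawCalls(Model model)
{
    int calls = 0;
    foreach (ModelMesh mesh in model.Meshes)
    {
        calls += mesh.MeshParts.Count;
    }
    return calls;
}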

I got a bit too carried away trying to get the minimum number of draw calls...

After having returned from the Olympic 3 Day Eventing final on Tuesday I started work on laying out another section of what will become the first level of the game.


Olympics
My wife is the horsey person, which is why, when the chance to get tickets to the 3 Day Eventing came up, we took it.  It was a good day with a fantastic atmosphere.  Team GB got a silver medal.  The stands shook with excitement.  I was very impressed with how the whole thing was organised.  There were lots of people, but everything still flowed; we quickly passed through security into the grounds of Greenwich Park and to our seats. 

The view from the stands was perfect and even with the few minutes of rain we had from time to time the brollies did not block the view.

My wife has some pictures and more info. on her blog:
http://tomandhenry.blogspot.co.uk/2012/08/greenwich-park-equestrian-eventing.html


I use Blender to create sections of the level and then import them into the game and move them into their final position using my own editor.
http://blog.diabolicalgame.co.uk/search/label/Editor


I've had to redo a lot of the work I did in Blender on Tuesday night.  Perhaps the excitement of the day got to me!



I had joined all the models together too early and moved vertices in Blender to lay out a large section of the level.  I did this with the intention of making the draw calls more efficient, but that was a mistake.  The single model is now too complex to separate and adjust sections of, and I had forgotten to allow for shadowing!

The Diabolical engine does not self-shadow.  Self-shadowing is where the triangles of a single model cast shadows onto other triangles in the same model.  I have other posts on the unsightly effects caused by self-shadowing so I won't repeat that here:
http://blog.diabolicalgame.co.uk/search/label/Shadows



3D Modelling for the Best Game Performance
Keep all models as separate component Objects so they can easily be picked up and moved about while laying out.  Combine them [Ctrl-J in Blender Object mode] at the last minute into a smaller number of meshes that share the same texture, to reduce the number of draw calls required by the game engine.

Remember to keep a copy of the source model file before the Objects are merged to make future adjustments easier.

3D Model Meshes
There is no advantage in combining Objects that use different textures.  Each material will automatically separate into another mesh in the exported file, because typically shader effect files only use one diffuse texture.

3D Modelling for Shadows
In the current Diabolical engine there is no self-shadowing, and this may apply to other engines.  When combining the Objects, look at which ones might cast shadows on other objects and keep them as separate models.





For my game engine I need to balance the performance gain from having fewer draw calls on the Xbox 360 against the visual quality of having shadows cast correctly onto more surfaces.