Friday, December 29, 2006

A first look at open source, decentralized version control systems

I am having a first look at this Quick Reference Guide to Free Software Decentralized Revision Control Systems, trying to find one that has both Windows and Unix clients, can be used without being permanently connected to the server or installing anything on it, and is suitable for 3D projects.
What I want is to be able to host a 3D project repository on simple shared web hosting, which offers little more than FTP, SQL and PHP.

These are my conclusions:

Checking Monotone's links I have found A Simple Version Control System, which is entirely PHP/SQL based, with file access through FTP. This looks like the best option for now.

Darcs has a very interesting approach: clients can read from the repository through "file system, (...) HTTP or email", and submit patches (diffs) through email. Also worth checking.

Bazaar lets you add, commit and recover binary files, and allows you to plug in support for diffing or merging, but it is "primarily a source code control system, not a media archive system". Still, it is worth checking.

Libresource seems very interesting, but complicated.

Subversion and the related SVK need adjustments to Apache's configuration via its httpd.conf file, which is not accessible in my case.

Monotone does not use a central server at all; rather, clients selectively communicate changes to each other using the netsync protocol. I'd rather go for a more user-transparent system.

Codeville does not support binaries yet (it lists "support for binaries" in its todo list).

Maybe CVS is overkill; other alternatives are:

Unison, which seems a bit problematic on Windows and needs a peer-to-peer connection.

Jon's open source render farm

Jon, from Bournemouth University, sent me an email:

"(...) at present we are developing a production pipeline for our Masters in computer Animation courses (...)
You can check out my website here and you may be interested in the open source render farm I have written (click on the Render Farm link under the masters section) (...)"

Thanks for the info, Jon, and check his page for news!


Jorge just sent me a brief note:
"I imagine you already know"

And I just replied
"Rats, no"

But I will have to go over it as well...

From their web:

Animation and Effects Production Pipeline

openPipeline is an open source framework for managing animation production data and workflow. Its first implementation is a MEL-based plug-in for Autodesk Maya that handles specific aspects of production: automatic directory structures, file naming conventions, revision control, and modularity that makes multi-artist workflows possible.

Announcements on cgtalk and highend3d

Friday, December 22, 2006

Book on Animation Production

I have started reading a book on animation production, not surprisingly entitled "Producing Animation".
The authors (Catherine Winder and Zahra Dowlatabadi) seem to have done intensive research on the subject, and overall it takes quite a practical approach to the matter.
I will tell you more when I have finished reading it.

Thursday, December 21, 2006

Software in the Animation Industry Data Base

As Diego suggested, I checked the AIDB site looking for existing software in this field.

There are 32 entries in the Workflow / Project Tracking section, but the info is a bit outdated and inaccurate.
Other than Alienbrain, I couldn't find any program that fitted the category.
Anyway, these 3 projects somehow caught my attention:

Wiredrive Projects: A project-based client area used for reviewing and approving creative work and production documents.

PECS Tools Suite is a "Data Pipeline Management Solution": A software and hardware solution for the management of Motion Capture Studios, including Planning, Shooting, Team management of post production, Asset management and version control, Quality control and Delivery scheduling.

ReviewManager is an outsourced client-review tool.

Wednesday, December 20, 2006

Question & reply to Diego García in cg-node

This is my feeble, semi-automatic, hand-corrected translation of the reply that Diego García Huerta (Senior Pipeline & Tools Developer at Blur Studios) gave me in this forum.

I asked him to help me with my research, with:
(A) Bibliography
(B) Software (programs)
(C) References (interviews, webs, manuals, whatever)

1.- A general approach to analyzing a business workflow. A recommended generic reference book ("The bible of...") that helps establish a proper theoretical background to build on top of.

2.- Detailed analysis of 3D production in different companies. I am trying to define a general workflow based on real cases.

3.- Programs that allow managing tasks, assets, processes, files, etc. Especially GPL ones.

4.- Nomenclature conventions.

Diego replied:
This is an interesting subject for a thesis, for sure, and indeed more research is needed in this field.

Obviously, in order to organize a studio with over 40 people you need some sort of help to speed up the search for files, know in real time what is going on in the production, automate processes such as asset creation, assign working hours, semi-automate budget calculation, etc.

To begin with, a good directory structure keeps the production from growing out of control like a wild plant.
We have a typical directory structure based on Project / Sequence / Shot. In each shot you can find all the files necessary for layout, animation and scene assembly, which normally reference the characters' meshes and rigs.
Other studios follow the same methodology, but it is difficult to find out exactly how a CG/VFX company's pipeline works, beyond some glimpses in the DVD extras of some films, mainly CG ones.
Perhaps the best way is interviewing people from those studios, and checking whether this is an open subject, as you are doing with me!
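To make the layout concrete, here is a minimal sketch (my own illustration, not Blur's actual tooling; all folder and department names are hypothetical) of how such a Project / Sequence / Shot tree could be generated:

```python
import os

# Hypothetical per-shot subfolders, following the departments mentioned
# above (layout, animation, scene assembly). Names are illustrative only.
SHOT_SUBDIRS = ["layout", "animation", "assembly"]

def create_shot_tree(root, project, sequences):
    """Create <root>/<project>/<sequence>/<shot>/<subdir> folders."""
    created = []
    for sequence, shots in sequences.items():
        for shot in shots:
            for sub in SHOT_SUBDIRS:
                path = os.path.join(root, project, sequence, shot, sub)
                os.makedirs(path, exist_ok=True)
                created.append(path)
    return created

# Example: one sequence with two shots -> 2 shots x 3 subdirs = 6 folders.
folders = create_shot_tree("./demo_project", "MyFilm", {"SQ01": ["SH010", "SH020"]})
print(len(folders))  # 6
```

The point is simply that the whole tree comes from one script, so every shot folder is guaranteed to look the same.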

After the Directory Structure, you need a Naming Convention (NC).
I have heard horror stories from Disney where, a long time ago, it simply consisted of a few letters and numbers.
Here at Blur, the names of the 3D objects, files, etc. are more or less indicative of their content (which complicates their management from a programming point of view, and is more vulnerable to human error).
Other studios decide to put everything in the name of the file, from the person using it to the day it was recorded.
I personally advocate clarity: neither too cryptic nor "everything in the name", although it means more problems for the programmers.
Here at Blur we have our own tools that add tags to the 3D objects or files, so that with those tags we can automate processes in a simple way, or identify different types of objects, such as the meshes that have to be exported as a point cloud for scene assembly.
An example of the NC of an object:
From this name you can tell it is a "Templar" character asset, that there are probably two other duplicates of this asset in the scene, and that this specific piece is the left leg, ready to be exported as a Point Cache from an animation file.
I know this sounds like madness, and of course it could be if artists didn't have tools that add, remove or automate tags in the names of the objects, but if this is followed throughout the whole pipeline it takes care of many of the typical problems, and the possibilities multiply when you can create tools that automate processes very easily.
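As an illustration of the tag idea, here is a tiny sketch. The name format and the tags below are my own invention (Blur's real convention is not public); the point is only that machine-readable tags make this kind of automation trivial:

```python
# Hypothetical tagged object name, fields separated by underscores:
#   <asset>_<duplicate>_<piece>_<exportTag>
# e.g. "Templar_02_legL_PC": a "Templar" asset, duplicate 02, left leg,
# tagged "PC" to be exported as a Point Cache. This format is invented
# here for illustration only.

def parse_object_name(name):
    asset, duplicate, piece, tag = name.split("_")
    return {
        "asset": asset,               # character or prop name
        "duplicate": int(duplicate),  # which copy of the asset in the scene
        "piece": piece,               # sub-object, e.g. the left leg
        "export": tag,                # e.g. "PC" = export as Point Cache
    }

info = parse_object_name("Templar_02_legL_PC")
print(info["asset"], info["export"])  # Templar PC
```

With names like these, a batch tool can select "every mesh tagged PC" without any human inspection.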

Then comes the Asset Management System. These came out two or three years ago, and each studio devised its own system, including us here at Blur. It's interesting to see how different and unconnected studios come up with the same ideas over and over; we are still far away from a DAM (Digital Asset Management) industry standard.

It is also interesting to point out that the videogame industry stepped into this subject long ago; maybe this is related to the fact that they are more programming-oriented. They started with check-in/check-out systems and then evolved to DAM systems.
Web content publishing is also very advanced in this area; there are many apps, even GPL ones. Look for MAM (Media Asset Management), DAM, or CMS (Content Management System).

Looking at your blog I see you have some interesting links, mainly the article on Final Fantasy, a clear example of how easy it is to start a DAM and how hard it is to finish it.

Digital C.O.R.E. started their own for the film "The Wild", and when the film was over they still hadn't finished; it turned out that in the end it was easier for them to redo it from the ground up with all the experience they had acquired.
Another recent example is "Barnyard", where the Omation studio began building their own system simultaneously with the production, and as you can imagine it was a bit chaotic. By the end of the production they had something decent that I guess they will be able to use if they make another CG film.

As for bibliography, I can recommend these two books; especially the first one is very useful as an introduction:

Implementing a Digital Asset Management System: For Animation, Computer Games, and Web Development
Focal Press (August 26, 2005)
ISBN: 0240806654

The Game Asset Pipeline (Game Development Series)
Charles River Media; 1 edition (September 2004)
ISBN: 1584503424

About commercial software, the best known is Alienbrain VFX. I can't tell you how good or bad it is because I have never used it, but it integrates with XSI, MAX and Maya.
I would also look in the Animation Industry Database to see if you can find something useful.

Good luck with the thesis, and keep me informed; it is a subject I am very interested in!

Tuesday, December 19, 2006

University research

I am very interested in contacting research groups in Europe that might be working on related subjects.
I will begin by going over this list.

Ares' own contribution

A new message from Ares brings a couple of (to me) new pieces of software. Here goes the usual translation:

An interesting production tool is "interactive storyboard for maya".
ISFM offers a fast and visual overview of the project: you can check what is already done and what status it is in, and also what has yet to be done.
It has an obvious connection with Maya, but also with Excel (statistics), Shake (postproduction), ...
The integration with Maya is complete, allowing you to run scripts on a batch of scenes, do playblasts, use an FTP system, visualize the storyboard with the Maya scenes, view icons of 3D files, sound, video from other sources, transitions...
This is an ISFM workflow scheme

Another tool is Reflex, although it is not for sale yet (I think it will come out in January 2007):
- It has tools to facilitate the workflow,
- Some very useful ones for supervision and data exchange among departments,
- Some animation tools identical to Jason Schleifer's "greasyPencil",
- And file management tools that allow locking character functions depending on the department you belong to.
They are asking for beta-testers (advanced students and animation studios).

Saturday, December 9, 2006

Another software hint from Miguel Angel Sánchez Cogolludo

This software is starting to be used in some new production houses (established ones have their own software): Gdi|Explorer.
The web page says it uses a widely adopted naming convention. There is a trial version.
It is quite complete, with task assignment, and it can even understand a Movie Magic Screenwriter "tagged script", creating folders for the assets indicated in the script.

It is quite good, although Miguel Ángel would like it to be tidier when creating categories.

Juanma Sánchez's book suggestion

Juanma suggests consulting some books on "project management", as managing an industrial project or a software development is very similar to managing a complex animation production.
He recommends "La guía definitiva de la gestión de proyectos" by Nokes and Greenwood, Ed. Prentice Hall (in Spanish), because it is a simple and straightforward read.

This is the Amazon page of the English version.

Monday, November 27, 2006

A couple of articles in XSIBase, sent by Luisma

This is an XSIBase interview with Diego Garcia (Blur); it might shed some more light on the matter.
Diego is answering questions (including mine) in a "know the artist" thread (in Spanish)!

And another interview with Graham Clark, partly related to pipeline creation, that might offer some interesting conclusions.

Book suggestion from David Llopis

3D Short Film Production is a book about the steps to follow to create an animated short film:
story, preproduction, file nomenclature, tools, and all subsequent steps.
It is very interesting and it can possibly help you.

Ideas from Luisma

I asked for ideas and suggestions in a 3D users forum, and Luisma sent me his own personal opinions about the matter of workflow and pipeline.

This is a summary, loosely condensed and poorly translated by me, with my own comments in italics.

Luisma's comments are from the point of view of a mere user, as he is neither a programmer nor an expert. Also, he is considering a type of production with several departments that will be exchanging or sharing assets most of the time, both internally and among departments, each using its own programs and tools, so it is a rather heterogeneous system.

An efficient pipeline will mainly be based on keeping strict discipline regarding naming conventions (nomenclature).
This will make it easier for everybody to:
1.- Find assets to work with.
2.- Keep track of the versions of each asset.
3.- Check and send assets along the pipeline to the next department.
4.- Write scripts and programs that work with assets.

This is mere common sense. Now let's get down to the practical side of it:
This is the list of "digital" departments ("non-digital" would be Art Direction, Concepts, Storyboard, Sets and Shooting, etc.) that we can find in a medium-to-big production, sorted in chronological order along the pipeline:

  1. Previs - Visualizes the director's idea
  2. Modelling - Creates models
  3. Creatures - Rigs the model
  4. Cameras and Layout - Places the model on the scenery and frames it with a camera
  5. Animation - Animates the model (normally a low-res version). Once approved, the animation will be "published" and passed onto a hi-res model.
  6. Coding
  7. Texturing - Creates textures
  8. Matte Painting
  9. Paint+Roto
  10. Shaders - Creates shaders
  11. Lighting (& FX?) - Applies textures and shaders to the hi-res model, lights the shot according to its visual characteristics, and renders the appropriate passes
  12. Compositing and colour grading - Finalizes the shot by mixing the rendered passes
Scanning+Recording, Production, Technical Support and several others, including the Pipeline Engineering Dpt. itself, also have access to the pipeline. (I think these Dpts. are placed outside the main schema because they are not "digital content creators".)

The best way to organize this huge amount of information is through a single Project directory, with several main subdirectories for reference and query that allow continuous access:
And then two "working" subdirectories:
/Assets: All the elements (characters, props, sceneries) with their corresponding revisions, kept up to date and related to other assets. For instance:
/Shots: All the production shots (/P01, /P02... /Pxx) with subfolders for each dpt. (/Models, /Anim, etc.); for instance, in Anim, several folders pointing to the "Assets" subfolder containing the needed assets (EDLs, models, cameras, audio, rigs, etc.).
A Pipeline Script then assembles all that into a 3D scene for the program used for animation.
An additional folder at that level is /Work, where the output work is saved into three folders (/Initial, /Progress, /Final) with the adequate versioning nomenclature: Pxx_CharacterA_lowres_v05_NA_ipxx.htr
where Pxx is the name of the shot, NA the animator's initials, and ipxx the version or take.
Once the shot goes into the "Final" folder, it is ready to be published for the next step (i.e. lighting).
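A naming scheme like this pays off precisely because scripts can parse it. As a sketch (the regex is my own guess at the convention; only the field meanings come from the description above), a parser for those /Work file names could look like this:

```python
import re

# Sketch of a parser for work file names of the form
#   Pxx_CharacterA_lowres_v05_NA_ipxx.htr
# where Pxx is the shot, v05 the revision, NA the animator's initials
# and ipxx the version/take. The exact pattern is my own guess at the
# convention, not a real studio specification.
NAME_RE = re.compile(
    r"^(?P<shot>P\d+)_"
    r"(?P<asset>[A-Za-z0-9]+)_"
    r"(?P<res>lowres|hires)_"
    r"v(?P<version>\d+)_"
    r"(?P<initials>[A-Z]+)_"
    r"ip(?P<take>\d+)\.(?P<ext>\w+)$"
)

def parse_work_file(filename):
    m = NAME_RE.match(filename)
    if not m:
        raise ValueError(f"not a valid work file name: {filename}")
    d = m.groupdict()
    d["version"] = int(d["version"])
    d["take"] = int(d["take"])
    return d

info = parse_work_file("P01_CharacterA_lowres_v05_NA_ip03.htr")
print(info["shot"], info["version"], info["initials"])  # P01 5 NA
```

A script like this is what turns a strict naming discipline into automation: finding the latest version of a shot, or routing a file to the right department, becomes a one-liner.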

This looks like a mess, and at first it certainly can be, but once you get used to it, it all makes sense, and it speeds up the processes of looking for assets, checking them, publishing them...
The less margin we leave for human error, the better, and applying a logical methodology saves a lot of problems. There are many things that computers are better at than us; a couple of scripts here and there make your life much easier.
To sum it up: be rigid, with almost dictatorial discipline, when naming and placing things. Don't allow liberties with this sort of thing; a mere error or act of foolishness can potentially become a huge mess.
Taking for granted the professional level (regarding discipline) of the team, the Pipeline must be at the service of the team, and not the other way around, as sometimes happens with so-called "pipelines"...

Of course this does not relate only to nomenclature, data structuring and such, but it begins there.

I don't know of any tool that allows creating an "ad hoc" pipeline, and its complexity will depend on a number of factors: the number of departments that need to share/exchange assets, the programs they use, the kind of data each dpt. publishes, etc. The tools written to configure the pipeline must take all this into account; since no two studios are alike, one tool can be too small or simple for some and too big or complex for others.

Thursday, November 23, 2006

Interesting suggestions by Miguel A.S. Cogolludo

Miguel A.S. Cogolludo has sent me a wealth of information he has been collecting lately: "Coincidentally, this is a subject that has caught my interest lately, and I have some links and files that might be of interest for you".

"There are some people who started trying it, but I haven't seen progress for some time: the Koji system".
Looking into Koji, two more projects show up: celtx and otc.

"These are two essays about Final Fantasy: Tracking Assets and Developing a Production Tracking Database."
"Another interview about the making of Final Fantasy."

Zeno, ILM's new system (registration required).
And also these three articles on AWN: #1 #2 #3

Interesting discussion about workflow management at Framestore

Wednesday, November 22, 2006

Box office hits: Are 3D films getting worse?

In this chart (data from boxofficemojo) you can see the films that have grossed (worldwide theatrical box office) above 200 million US$ since 1995 (Toy Story).
Green dots indicate 3D movies.
"Titanic" (1,845 million US$) has been left out to keep the chart readable.

You can see that 3D movies between 1995 and 2002 perform between 350 and 550 million US$. Then comes a peak with "Finding Nemo" (2003, 854 million US$) and "Shrek 2" (2004, 920 million US$), and right after that a severe drop, with some films grossing under 250 million.

Moving on to a more subjective analysis, I have a feeling that 3D movies are getting worse every year. This feeling comes from two factors:

- Lately there are too many 3D films per year. Until 2004 there were 1 to 4 films per year. Then the number rises steeply: 6 in 2005, and 11 so far in 2006.

- Most of these films have no appeal at all: aren't you bored of "twin films" about "funny animals" (2 × ants, 2 × fish, 2 × penguins...)?

- On top of that, living in Spain means an even worse handicap: films are dubbed into Spanish by comedians or actors, not by professional dubbing actors, and they even dare to put their own jokes into the film! As the number of films per year increases, this is getting worse and worse.