See part 2 here

See part 4 here

Before getting into daylight renderings and searching for optimized solutions, we should get a simple cloud rendering working correctly.

The graph below shows the simplest setup for uploading a render job and viewing the resulting image.  You can check out this sample from within Dynamo: go to the Help menu -> Samples -> 25 Rendering.

I’ll also be packaging up my custom nodes from these daylighting posts and putting them on the package manager.

The main parts of this graph are as follows:

We specify the name of the view we want to render, in this case {3D}, as a string.  This gets passed into the Export Render Data node and the Cloud Rendering Job node.

First a render file is exported, and then uploaded to Autodesk360.

Then we specify a height and width for our rendering; each dimension must be greater than 110 pixels.

We use the Render Type and Rendering Quality nodes to specify strings for the type and quality.

The Cloud Rendering Job node produces a job that we pass to the Cloud Render node.  This node actually initiates the render; graph execution will pause at this node until the render returns, times out, or throws an exception.  When the render completes, it returns a list of file paths to the rendered images on your local machine.  Here I use First of List to grab the file path, and then Read Image to load the image into Dynamo so we can display it in the graph.

Some caveats:

For these sample graphs make sure you are in the project environment of Revit.

Rendering using Dynamo with an Autodesk education account will allow you to make unlimited renderings.

These renderings cost cloud credits if you do not have an edu account – but draft quality renders should be free.

By changing the rendering type and leaving all other parameters the same, we can easily generate a false-color illuminance render.

Next we’ll look at using the daylighting rendering tools to build a minimal graph and get daylighting data back that we can use to drive a Revit model.

The graph below is a simplified daylighting workflow.  It contains all the pieces necessary to create a daylighting rendering job, upload it, and inspect the results.  It's a bit more complex than the rendering graphs and provides us with more data to analyze, not just an image, but it requires more setup.

I’m going to step through the distinct pieces of this graph.

I’ll start with the Export Render Data node.  This does the same thing as in the rendering graphs: it exports data about the model and view we want to render.  I hook a Transaction node up to it to stop the Dynamo UI from hanging while the render is in process.  This file is then uploaded to a path online, same as before.

Instead of a Cloud Render Job, we use a Cloud Daylighting Job node.  This takes a few more inputs:

1) viewname is the same string we used for the Export Render Data node.

2) x and y divisions are similar to resolution, but since this render job does not only create an image, we should think of these as divisions of our sampling grid.

3) bottom left, bottom right, and top left are XYZ points defining a plane.  This is the plane we are going to sample illuminance values across.

4) environment is a daylighting environment object that tells the rendering engine where our site is located, what time of day we are sampling at, what atmospheric conditions are present, and the type of sky model we are going to render with.

In the graph below I’ve set the divisions to 150, which is the minimum sampling division.

My plane points are Bottom Left = (-100,-100,3), Bottom Right = (100,-100,3), and Top Left = (-100,100,3).  These define a 200 unit by 200 unit plane at 3 units high.  (Note that in the US the work plane for daylighting illuminance evaluation at an average desk is usually set at 2.5′ (0.76 meters).)
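To make the geometry concrete, here is a small sketch of how the three corner points and the divisions could define a grid of sample points, traversed left to right, bottom to top.  This is my own illustration, not the actual sampling code the rendering engine runs; the inclusive corner-to-corner spacing is an assumption.

```python
def grid_points(bottom_left, bottom_right, top_left, x_div, y_div):
    """Generate sample points across the plane defined by three corners.

    Corner names follow the Cloud Daylighting Job inputs.  Points are
    emitted left to right, bottom to top, matching the result-list order
    described later.  Inclusive endpoint spacing is an assumption.
    """
    bx, by, bz = bottom_left
    rx, ry, rz = bottom_right
    tx, ty, tz = top_left
    pts = []
    for row in range(y_div):
        v = row / (y_div - 1)          # 0.0 at the bottom edge, 1.0 at the top
        for col in range(x_div):
            u = col / (x_div - 1)      # 0.0 at the left edge, 1.0 at the right
            # Interpolate along the two edge vectors of the plane.
            x = bx + u * (rx - bx) + v * (tx - bx)
            y = by + u * (ry - by) + v * (ty - by)
            z = bz + u * (rz - bz) + v * (tz - bz)
            pts.append((x, y, z))
    return pts

pts = grid_points((-100, -100, 3), (100, -100, 3), (-100, 100, 3), 150, 150)
# 150 x 150 divisions give 22,500 sample points across the 200 x 200 plane.
```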

Now let’s take a look at the environment and sky model setup.  These are the nodes you must configure to get a meaningful and accurate daylighting analysis.

The Perez sky model is the only type currently built into Dynamo, but you should be able to pass a string into sky model type to get other sky models if needed; see this document for reference:

http://sustainabilityworkshop.autodesk.com/buildings/revit-illuminance-simulations

The Daylighting Skymodel node returns a sky model object; we need to give it some atmospheric values and a sky model type.  It will use default values if you do not input anything for the atmospheric values, but then the analysis will not be accurate.  These values can be found on an hourly basis for a specific site by using Green Building Studio to generate a weather data file.  There are also other online databases that make this data freely available; for example, you can find this data from EnergyPlus or the Department of Energy.

For now we can leave these set to default values, or look them up and set them manually; we’ll deal with parsing that data and setting it automatically later.

If left untouched, these are the default values:

Diffuse irradiance, in W/m^2: 208

Direct irradiance, in W/m^2: 784

Global irradiance, in W/m^2: 959

We use a DateTime object to set the specific time of day and year at which we are sampling daylight values.  We must manually specify the time zone offset from UTC.  In this case I am rendering at latitude 40, longitude -71, which is within the Eastern Standard Time zone, 5 hours behind UTC.  We enter a render time of 12pm in local time, then the 5 hour offset, and you can see that the DateTime object returned is at 5pm, in UTC.
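The offset arithmetic above can be checked with plain Python datetimes.  This is just an illustration of the time zone math, not the Dynamo node itself; the date is arbitrary.

```python
from datetime import datetime, timedelta, timezone

# Eastern Standard Time is a fixed offset of UTC-5.
est = timezone(timedelta(hours=-5))

# A local render time of 12pm on the site (date chosen only for illustration).
local_noon = datetime(2014, 6, 21, 12, 0, tzinfo=est)

# Converting to UTC shifts the clock forward by the 5 hour offset.
utc_time = local_noon.astimezone(timezone.utc)
print(utc_time.hour)  # 17, i.e. 5pm UTC
```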

Both of these get hooked up to the Daylight Environment node, along with the latitude and longitude of the current site.  Note that north/east are positive, while south/west are negative.  Also note that loc x is latitude, and loc y is longitude.  This node returns an environment object that we pass to our cloud daylighting job node.

Like in the previous cloud render graphs, the Cloud Render node handles uploading the rendering, monitoring the job, and pausing graph execution until it returns.  This node can actually accept lists of jobs: we could create multiple jobs and pass them to this node to upload them all at once.  That is why this node returns a list of results.

When the node returns a result and resumes execution of the graph, we pass the result to Parse Daylighting File.  This node parses the file at a path and returns a format we can turn into an image, a list of colors, illuminance values, etc.  In the graph below we use Get Daylighting Grid Image, which returns a bitmap image where the color at each pixel is a scaled illuminance value.  The resolution of the image is equal to the sampling density we set before, so 150×150 pixels.

Between the Get Daylighting Grid Image node and the Watch Image node, we use a First of List node to extract the first image.  This is necessary because we passed a list of file paths into the Parse Daylighting File node; even though that list only had one item in it, the Get Daylighting Grid Image node returns a list with one image in it.

So how do we drill down into the data that we’re getting back?  The Get Daylighting Grid series of nodes gets the daylighting data parsed from a daylighting render.  They return the data at every sample point in a single list.  This list starts at coordinate 0,0 and goes left to right, bottom to top, of the sampling plane; with 150 divisions the last coordinate is 149,149.

To get an illuminance value at a specific x,y coordinate within the local space of the plane, we use a simple formula: Row * NumberOfColumns + Column (X position = Column, Y position = Row).  We use the Get Daylighting Grid Resolution node to get the number of columns.  The result of this calculation is an index into the list of illuminance values.  Note that we used the Illuminance to sRGB node to convert the red, green, and blue components of lux into a single weighted illuminance value.
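The row-major lookup formula is easy to express directly; a minimal sketch, using a hypothetical helper name:

```python
def grid_index(column, row, num_columns):
    """Flatten an (x, y) grid coordinate into an index in the results list,
    which runs left to right, bottom to top."""
    return row * num_columns + column

# With a 150x150 sampling grid:
grid_index(0, 0, 150)    # 0, the bottom-left sample
grid_index(149, 0, 150)  # 149, the bottom-right sample
grid_index(0, 1, 150)    # 150, the first sample of the second row
```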

We use a similar lookup strategy below, but this time we can use a world coordinate XYZ position to look up the illuminance.  Here we’re looking up point (-100,100,3) and finding the index of this position in the list of positions that were sampled.  We then use that same index to lookup the illuminance at that position.

IndexOf is a custom python node that will give us the index of an XYZ within a list of XYZs.

This code uses the next() function and enumerate() to check each item in a list of XYZs; if an item IsAlmostEqualTo the value we’re looking for, we return the index of that item.  We use XYZ.IsAlmostEqualTo because I found that XYZ.Equals() would always return false.
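The same pattern can be sketched outside of Revit using plain tuples, with an explicit tolerance check standing in for XYZ.IsAlmostEqualTo.  This is a stand-in illustration, not the actual custom node's code, and the function name and tolerance are my own choices:

```python
def index_of_xyz(points, target, tolerance=1e-9):
    """Return the index of the first point almost equal to target, or -1.

    Plain-tuple stand-in for the Revit XYZ version; the per-component
    tolerance check plays the role of XYZ.IsAlmostEqualTo, since exact
    equality on floating point coordinates rarely succeeds.
    """
    return next(
        (i for i, p in enumerate(points)
         if all(abs(a - b) <= tolerance for a, b in zip(p, target))),
        -1,  # default when no point matches
    )

points = [(-100.0, 100.0, 3.0), (0.0, 0.0, 3.0), (100.0, -100.0, 3.0)]
index_of_xyz(points, (0.0, 0.0, 3.0))  # 1
```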

If you want to learn more about this code the python docs are useful: http://www.itmaybeahack.com/book/python-2.6/html/p02/p02c08_generators.html

Here are some other resources for solving a similar problem:

http://tomayko.com/writings/cleanest-python-find-in-list-function

For some more information on using Python in general with Revit, check out Nathan Miller’s site here:

http://wiki.theprovingground.org/revit-api

There’s also a Dynamo-only solution to this problem that you could try as an exercise: we could zip our two lists together so they look like:

(xyz, illum)

(xyz, illum)

etc.

We would then write a function that checks the first item for equality with the value we want to find; if true, we take the second item of the same sublist, which is the illuminance value.  This workflow is nice because we never have to record or think about the index value at all; we just filter the zipped list of pairs by a function that returns true when the first value is the XYZ we’re looking for.
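The zip-and-filter idea looks like this in Python terms.  The positions and illuminance values here are made-up sample data, not real render output:

```python
# Hypothetical data: sampled positions paired with their illuminance values.
positions = [(-100.0, -100.0, 3.0), (0.0, 0.0, 3.0), (100.0, 100.0, 3.0)]
illuminance = [120.5, 843.2, 310.7]

# Zip into (xyz, illum) pairs, mirroring the list structure described above.
pairs = list(zip(positions, illuminance))

# Filter by the XYZ we want; no index bookkeeping is ever needed.
target = (0.0, 0.0, 3.0)
matches = [illum for xyz, illum in pairs if xyz == target]
# matches == [843.2]
```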

Last, let’s look at pushing some of this daylighting data to the Revit model; we can do this with an analysis display style.  The first thing is to go into your current Revit window and create a new analysis display style that plots data on a surface as a series of colors at (U,V) pairs.  We need to specify the colors we’re going to map the values to and the type of display to use.  We are going to plot our data along a face; we’ll use the floor to keep it simple.

The first step is to generate the same number of UV positions as sampling positions that we took.  To do this we use Select Face to get a reference to the floor face, then a custom node, Display Lighting Data on Surface.  We input the face we want to plot data on, the XYZ positions of the sample points, and the lighting data.  This node uses a bit of Python to project each XYZ position down to the face; if it intersects, it saves that lighting data and pairs it in a new list with the resulting U,V coordinate on the face.  We have to wrap this node with a Transaction because the Analysis Display Surface node requires a transaction.

This will plot the analysis data on the floor face.  We can also do the same with analysis display points, but beware: since each point is an independent object, this will slow down Revit.