
E2R - ETABS Revit Connection Tool

After developing R2R, the RAM Structural to Revit connection tool, I wanted to make E2R, an ETABS to Revit connection tool. Below are some screenshots of the newly developed tool.

I made a YouTube video on how to use the tool here. The video goes in depth on each button and the things to look out for while using the tool.

The tool will hopefully be available on the Autodesk App Store soon and will be free to download and test out. Please let me know if you encounter any bugs as you use the tool.

The tool checks ETABS beams against Revit beams and is ideal for large steel-framed structures, letting you quickly back-check framing sizes and catch discrepancies between ETABS and Revit.

The Coding - More To Come

MVVM Magic and Modeless Mayhem in Revit

After making the RAM to Revit tool, I wanted to make a similar tool, but for ETABS this time.

I learned a lot making the RAM to Revit tool, but also learned a few things that I would like to do differently given the chance. 

MVVM Magic

MVVM stands for Model-View-ViewModel, and it serves as a way to effectively decouple the user interface (in my case, my XAML code) from my data (the Model). You do this by creating a ViewModel that allows the Model and the View to stay completely separate.

I will admit, I am speaking like I know what this means when it is still somewhat of a mystery to me, but I think it will prove beneficial in making the ETABS to Revit tool. The amount of code I spent keeping the UI and the backend data synced up for the R2R tool started to become overwhelming, and it is my understanding that MVVM is a much cleaner way to keep the data and the view separate.
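As a rough illustration of the pattern (not the actual E2R classes), a bare-bones ViewModel raising PropertyChanged might look something like this, with BeamModel and BeamViewModel being made-up names:

using System.ComponentModel;
using System.Runtime.CompilerServices;

// Model: plain data, knows nothing about the UI (made-up class for illustration)
public class BeamModel
{
    public string Name { get; set; }
    public string Size { get; set; }
}

// ViewModel: exposes the model to the view and tells the view when data changes
public class BeamViewModel : INotifyPropertyChanged
{
    private readonly BeamModel _beam = new BeamModel();

    public string BeamSize
    {
        get { return _beam.Size; }
        set
        {
            _beam.Size = value;
            OnPropertyChanged(); // any XAML control bound to BeamSize refreshes itself
        }
    }

    public event PropertyChangedEventHandler PropertyChanged;

    private void OnPropertyChanged([CallerMemberName] string propertyName = null)
    {
        PropertyChanged?.Invoke(this, new PropertyChangedEventArgs(propertyName));
    }
}

// In the view, a control bound with Text="{Binding BeamSize}" stays in sync with the
// data without any code-behind shuffling values back and forth.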

Below is a gif of the view being dynamically updated as changes occur in the Revit model, really cool stuff. Big thanks to scgq425 on the Revit API forums for helping me out with this.

Sample code is on GitHub.

Learning Points with MVVM and Revit

I struggled with getting my MVVM code to work because I was using an IEnumerable (which does not inherit a lot of the MVVM goodness) when I should have been using an ObservableCollection to store my Revit beam data.

// Backing field for the collection bound to the view
private ObservableCollection<RevitFramingModel> _structuralFramingElements;

public ObservableCollection<RevitFramingModel> StructuralFramingElements
{
    get { return _structuralFramingElements; }
    set
    {
        _structuralFramingElements = value;
        // Notify any bound XAML controls that the collection was swapped out
        PropertyChanged?.Invoke(this, new PropertyChangedEventArgs(nameof(StructuralFramingElements)));
    }
}

I also struggled with trying to trigger the data to update when a change is made in the Revit document. After searching the Revit API documentation, it seemed like the IUpdater interface would do the trick. From the Revit documentation:

IUpdater Interface - The interface used to create an updater capable of reacting to changes in the Revit model.

I wanted to react to changes to my beams, so I thought this would do the trick. After throwing the IUpdater interface into my ViewModel, my code seemed to be working.

public class StructuralFramingUpdater : IUpdater
{
    static AddInId m_appId;
    static UpdaterId m_updaterId;
    private readonly MainViewModel _mainViewModel;

    public StructuralFramingUpdater(AddInId id, MainViewModel mainViewModel)
    {
        m_appId = id;
        m_updaterId = new UpdaterId(m_appId, new Guid("FBFBF6B2-4C06-42d4-97C1-D1B4EB593EFF"));
        _mainViewModel = mainViewModel;
    }

    public void Execute(UpdaterData data)
    {
        ObservableCollection<RevitFramingModel> structuralFramingElements = _mainViewModel.StructuralFramingElements;
        // Check each modified element; only react to structural framing
        foreach (ElementId elementId in data.GetModifiedElementIds())
        {
            Element element = data.GetDocument().GetElement(elementId);
            if (element?.Category?.Name == "Structural Framing")
            {
                // Update the corresponding RevitFramingModel
                RevitFramingModel framingModel = structuralFramingElements.FirstOrDefault(m => m.Id == element.UniqueId);
                if (framingModel != null)
                {
                    framingModel.Name = element.Name;
                }
            }
        }
        // Reassign the collection so the property setter raises PropertyChanged and the view refreshes
        _mainViewModel.StructuralFramingElements = structuralFramingElements;
    }

    public string GetAdditionalInformation()
    {
        return "Structural Framing Updater: updates structural framing models when changes occur";
    }

    public ChangePriority GetChangePriority()
    {
        return ChangePriority.Annotations;
    }

    public UpdaterId GetUpdaterId()
    {
        return m_updaterId;
    }

    public string GetUpdaterName()
    {
        return "Structural Framing Updater";
    }
}
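One thing the snippet above does not show: the updater has to be registered with the UpdaterRegistry and given a trigger before it will ever fire. A rough sketch of that registration (the ActiveAddInId and _mainViewModel references are my assumptions based on context, not the tool's exact wiring):

// Register the updater and add a trigger so it only reacts to structural framing changes
StructuralFramingUpdater updater = new StructuralFramingUpdater(application.ActiveAddInId, _mainViewModel);
UpdaterRegistry.RegisterUpdater(updater);

ElementCategoryFilter framingFilter = new ElementCategoryFilter(BuiltInCategory.OST_StructuralFraming);
UpdaterRegistry.AddTrigger(updater.GetUpdaterId(), framingFilter, Element.GetChangeTypeAny());

// When the add-in shuts down, clean up:
// UpdaterRegistry.UnregisterUpdater(updater.GetUpdaterId());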

I used ChatGPT to write a lot of this code, and it was funny to see that ChatGPT was essentially "copying" this portion of the web:

https://help.autodesk.com/view/RVT/2022/ENU/?guid=Revit_API_Revit_API_Developers_Guide_Advanced_Topics_Dynamic_Model_Update_Implementing_IUpdater_html

I noticed that the GUID in both was identical, pretty funny.

Modeless Mayhem

The next item I wanted to solve from the R2R tool was the lack of interactivity: while the R2R popup window is open, you are not able to interact with your Revit model. This is a huge downside that I want to come back to, but for now, I wanted to make sure E2R would not be limited in the same way.

The fix to make the window "modeless" was switching the popup window from

mainWindow.ShowDialog(); to mainWindow.Show();

This simple little change did the trick. I think making the full-fledged E2R tool modeless will be more complicated than this moving forward, with multiple windows and popups, but for now I am cautiously optimistic that I can keep the application modeless with little coding brainpower spent.
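One complication I know is coming (a general Revit API constraint, not something E2R handles yet): a modeless window cannot call the Revit API directly from its own button clicks, so model changes have to be funneled through an ExternalEvent. A minimal sketch of that pattern with a made-up handler:

using Autodesk.Revit.DB;
using Autodesk.Revit.UI;

// Made-up handler: writes a comment onto a beam when the modeless window asks for it
public class BeamCommentHandler : IExternalEventHandler
{
    public ElementId TargetId { get; set; }
    public string Comment { get; set; }

    public void Execute(UIApplication app)
    {
        Document doc = app.ActiveUIDocument.Document;
        using (Transaction t = new Transaction(doc, "Update Beam Comment"))
        {
            t.Start();
            Element beam = doc.GetElement(TargetId);
            beam?.get_Parameter(BuiltInParameter.ALL_MODEL_INSTANCE_COMMENTS)?.Set(Comment);
            t.Commit();
        }
    }

    public string GetName()
    {
        return "Beam Comment Handler";
    }
}

// In the window: create the event once...
//   var handler = new BeamCommentHandler();
//   var commentEvent = ExternalEvent.Create(handler);
// ...then raise it from a button click instead of touching the API directly:
//   handler.TargetId = selectedId; handler.Comment = "Checked against ETABS"; commentEvent.Raise();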

ETABS to Revit tool coming soon...


Autodesk Platform Services - A First Step

I'm not really sure what Autodesk Platform Services is or why I should use it, but it seems like Autodesk is pushing it hard. The tech looks cool, so I figured I would follow their first tutorial.

The first example they provide on APS is how to upload your local Revit model to their cloud service and view the model in your web browser. Really cool! The example model shown in the gif above is the same model I built to show off the RAM to Revit tool.

A Return to JavaScript

The documentation, found here on Autodesk's GitHub along with the web example, is really well put together; I just made one stupid error that cost me about 4 hours of debugging and one stupid post to Stack Overflow to try to solve. I have been meaning to come back to the concrete design tool, which is written in JavaScript, for a while, so when I saw the tutorial recommended Node.js and JavaScript as the first option, I hopped on the opportunity to refresh my JavaScript skills.

A few blog posts ago, I think I mentioned that one day I hope to experience fewer "firsts" in programming, and this was a great example of fewer "firsts". I had played with all the tech stacks used in the example; Node.js, npm, and JavaScript were all at least somewhat familiar to me. Finally! No watching 10 hours of videos just to have a basic understanding of where to begin.

Also, APS is built using three.js, which I have experience with from building the about page on this website as well as the concrete design tool, how cool! It's crazy how all these tech stacks build on top of each other; I'm finally starting to feel like a semi-experienced programmer.

APS Tutorial

As I mentioned, the APS tutorial is really well put together, but I will highlight a few things I found a bit unclear from my beginner-programmer mindset.

First, if you are on Windows, make sure you are using PowerShell to enter the commands in the documentation. Originally I was in Git Bash, which was not the right spot. In the image below you can see Git Bash was working, but eventually I realized I needed to be in PowerShell.

The code is pretty advanced, so there were a lot of times where I had absolutely no idea what I was copying and pasting. Throwing the code into ChatGPT and asking it to explain it to me was helpful.

At the authentication step of the process, the documentation mentions that you should be greeted with a window, which I was not getting. I should have realized that I had made a mistake and gone back to fix it, but instead I forged on (pun intended, APS used to be known as Forge) and made my previous mistake much worse.

My mistake: in the .env file, you are supposed to store your APS_CLIENT_ID and APS_CLIENT_SECRET as strings. The code that is put up on the website looks like this:

APS_CLIENT_ID="<client-id>"
APS_CLIENT_SECRET="<client-secret>"

I replaced this with:

APS_CLIENT_ID="<MYTESTCLIENT>"
APS_CLIENT_SECRET="<DKLJSADKNVDSLK15154541215F>"

Be sure to remove the "<" and ">". This stupid error led me down quite the rabbit hole trying to debug it. See my stupid Stack Overflow post here:

https://stackoverflow.com/questions/78138283/autodesk-platform-services-access-token

My code is up on GitHub, but it's just a copy of the Autodesk code. Feel free to take a peek.
 

What's Next?

I've had people ask, why are you doing this? What's the end goal? Sometimes you don't need an end goal to enjoy the journey. For now, I'm going to keep going with the examples that are up on the APS GitHub. It looks like there are some really cool dashboards that can be created and other exciting stuff. Again, nothing immediately jumps out at me as "I need this to improve X/Y workflow", but I have to admit it is cool.

Another item to research: how much will this cost? APS runs on a currency called tokens. How much are tokens? How many tokens are consumed? I've got a lot of questions that I need to dig into to understand a bit more. I have a feeling, though, with a name like "tokens" it's gonna cost a lot. Maybe it's one of those "if you have to ask, you can't afford it" type deals? Let's hope not!

 

RAM to Revit - First Update V1.0.1

The beta release of the RAM to Revit tool has gone well. I have a few people using the tool in practice and have heard some good feedback so far. I am actively trying to maintain this and make it better, so if you have any requests, please reach out. I will try to get your feature added.

Big Update #1 - Calculate X Y and Rotation Function Upgrade

The Calculate X Y and Rotation function has been greatly improved and does a much better job of guesstimating the rotation and offset parameters between your RAM coordinate system and your Revit coordinate system. My error: I had not converted the intersection point of the RAM gridlines into the Revit coordinate system, which led to some really bad estimates of the X and Y offsets. This has been revised and should now work in the majority of cases.
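For anyone curious, the math here is just a 2D rotation plus translation. A rough sketch of the transform (my own illustrative code, not the exact function in the tool):

using System;

// Sketch: express a RAM (x, y) point in Revit coordinates by rotating it through the
// angle between the two coordinate systems (theta, in radians) and then applying the
// X and Y offsets. Names are illustrative only.
public static (double X, double Y) RamToRevit(double ramX, double ramY,
                                              double theta, double offsetX, double offsetY)
{
    double revitX = ramX * Math.Cos(theta) - ramY * Math.Sin(theta) + offsetX;
    double revitY = ramX * Math.Sin(theta) + ramY * Math.Cos(theta) + offsetY;
    return (revitX, revitY);
}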

Update #2 - Mouse Tracking of Coordinate in the Mapping Plane

Should the "Calculate X Y and Rotation" button not do a good job of estimating the X Y and rotation parameters, I have added mouse coordinates in the lower left of the mapping plane. This should allow for easy hand calculations of the X and Y offset parameters should you not be aligned.

Update #3 - Additional Beam Info - RAM Beam #, Camber, LRFD Start and End Reaction

I had a user ask for some additional information in the datagrid: RAM beam #, camber, and beam end shear reactions. Eventually they want to map the start and end reactions onto the Revit beam for quicker documentation. Side note: I feel sorry for you east coast engineers; sometimes it's just easier to design the simple shear tab connections.

I am working with this user to see how they like to schedule their end reactions and hope to have a way to write the start and end reaction parameters into the Revit beam soon. More to come on this. The reaction that is recorded is an LRFD factored reaction that only looks at RAM Steel Beam gravity results. It checks both 1.4DL and 1.2DL + 1.6LL and nothing else. Use with caution.
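As a preview of what that might look like (a rough sketch with made-up parameter names, not the final implementation), writing a reaction onto a Revit beam is just a parameter set inside a transaction:

using Autodesk.Revit.DB;

// Sketch: write start/end reactions onto instance parameters of a Revit beam.
// "Start Reaction" and "End Reaction" are made-up parameter names; the real tool will
// use whatever parameters the user's schedule expects.
public static void WriteReactions(Document doc, Element beam, double startReactionKips, double endReactionKips)
{
    using (Transaction t = new Transaction(doc, "Write Beam Reactions"))
    {
        t.Start();
        beam.LookupParameter("Start Reaction")?.Set(startReactionKips);
        beam.LookupParameter("End Reaction")?.Set(endReactionKips);
        t.Commit();
    }
}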

I also have one user with a weird bug that will not allow them to access the RAM Structural API.

I am hoping that this new update will give me a little more insight into why it is not working on their computer. I finally learned how to generate a meaningful error message using the try/catch keywords in C#. Hopefully the exception that is thrown will be more insightful than my previous error message.

private void getRAMResults_Click(object sender, RoutedEventArgs e)
{
    if (ramFilePath == null)
    {
        MessageBox.Show($"You must select a RAM Structural model first, .rss", "Error", MessageBoxButton.OK, MessageBoxImage.Error);
        return;
    }
    //List<string> storyNames = RAMInfo.GET_STORY_NAMES(ramFilePath);
    try
    {
        List<string> storyNames = RAMInfo.GET_STORY_NAMES(ramFilePath);
        ramFloorComboBox.ItemsSource = storyNames;
    }
    catch (Exception ex)
    {
        MessageBox.Show($"An error occurred: {ex.Message}\n\nLikely errors include:\n\n1. RAM Structural is not installed on your computer\n\n2. Your RAM structural file is corrupt, check for .USR file \n\n3. Clear your working file directory, usually located here C:\\ProgramData\\Bentley\\Engineering\\RAM Structural System\\Working" , "Error", MessageBoxButton.OK, MessageBoxImage.Error);
        return;
    }
}

Section Cut Tool Upgrades

The Section Cut Tool got a big upgrade:
 

  • Ability to plot multiple load steps for a load case that contains multiple directions of loading.
    • Seismic X, Seismic X +e, etc.
    • Currently only supports ASCE 7-16 and ASCE 7-22 seismic loads. If you use this tool and want your specific code included, leave a comment here or on GitHub. It should be a quick enough update.
  • Interactive plots with the data grid below (highlight a row in the datagrid and the corresponding plots will have a gold dot filled in).
  • Tabular data now outputs the length of the section cut.
  • YouTube video for install
    • Hearing yourself on a YouTube video is just as bad as one might think it is, but hopefully this will help people get the tool up and running on their computers.
  • YouTube video for how to use the tool

The code has been updated on GitHub and the install files have been uploaded as well. Give it a spin.

Coding

Coding this involved much more code than I ever could have imagined. I should know by now to take my time estimates for a task and multiply them by about 5.

ETABS API - Database Tables, Not As Good As I Once Thought

Database tables seem to be the way the ETABS API is pushing coders to access data. This is good and bad from my beginner's perspective. The good: most anything is available programmatically. The bad: the API response is a garbled, unorganized mess of data. Take, for example, the section cut results.

I used to be able to call this function to retrieve section cut results before upgrading the tool to pull differing load steps.

This function was nice because it explicitly tells you what it will return. But if you happen to throw a load case with multiple load steps into this function, it will not work for some reason. I was able to get around this by calling the database tables method to return the section cut database table.

            string TableKey3 = "Section Cut Forces - Analysis";
            string[] FieldKeyList = null;
            string GroupName = "All";
            int TableVersion = 1;
            string[] FieldKeysIncluded2 = null;
            int NumRecords = 0;
            string[] TableData2 = null;

            _SapModel.DatabaseTables.GetTableForDisplayArray(TableKey3, ref FieldKeyList, GroupName, ref TableVersion, ref FieldKeysIncluded2, ref NumRecords, ref TableData2);
            //THIS IS THE LIST TO HOLD ALL LOAD STEP RESULTS FOR A SELECTED LOAD CASE WITH STEPS
            listSectionResults = new List<SectionResults>();

            //this is the multiple load steps
            if (FieldKeysIncluded2.Contains("StepNumber"))
            {
                //This list will contain all of the results from one load case with multiple steps
                List<int> indices = new List<int>();
                // Find indices of "EqAll" in TableData2
                int loadSteps = listBoxLoadSteps.Items.Count;
                indices = SectionCutResults.FindIndices(TableData2, LoadCaseComBox.SelectedItem.ToString());

                //int step = indices.Count() / loadSteps;
                // Grab specific items by index using LINQ
                var specificItems = indices.Where((item, index2) => index2 % loadSteps == 0);
                List<string> OrderedSeismicList = new List<string>();
                OrderedSeismicList = LoadCaseList[index].GetOrderedSeismicDirections();

                //run from 0 to load steps
                for (int i = 0; i < loadSteps; i++)
                {
                    List<Double> F1loop = new List<double>();
                    List<Double> F2loop = new List<double>();
                    List<Double> F3loop = new List<double>();
                    List<Double> M1loop = new List<double>();
                    List<Double> M2loop = new List<double>();
                    List<Double> M3loop = new List<double>();
                    //IF THIS TABLE GETS REWORKED IN THE FUTURE, THIS WILL NEED TO BE RECODED!!!
                    SectionResults loadCaseResults = new SectionResults();
                    //AREA OF INTEREST

                    //note index is the corresponding loadcase selected in the LoadCaseList
                    //This is Eq X, Eq X e+ etc.
                    loadCaseResults.LoadDirection = OrderedSeismicList[i];
                    //We generate all of the results for an individual load case / load step in this loop.
                    foreach (int sectionResult in specificItems)
                    {
                        F1loop.Add(Convert.ToDouble(TableData2[sectionResult + i * 14 + 4]));
                        F2loop.Add(Convert.ToDouble(TableData2[sectionResult + i * 14 + 5]));
                        F3loop.Add(Convert.ToDouble(TableData2[sectionResult + i * 14 + 6]));
                        M1loop.Add(Convert.ToDouble(TableData2[sectionResult + i * 14 + 7]));
                        M2loop.Add(Convert.ToDouble(TableData2[sectionResult + i * 14 + 8]));
                        M3loop.Add(Convert.ToDouble(TableData2[sectionResult + i * 14 + 9]));
                    }
                    loadCaseResults.F1 = F1loop.ToArray();
                    loadCaseResults.F2 = F2loop.ToArray();
                    loadCaseResults.F3 = F3loop.ToArray();
                    loadCaseResults.M1 = M1loop.ToArray();
                    loadCaseResults.M2 = M2loop.ToArray();
                    loadCaseResults.M3 = M3loop.ToArray();
                    listSectionResults.Add(loadCaseResults);
                }
            }

The problem with this? All the data comes back in one massive flat list/array. I had to write a lot of code to pick apart this massive response, a big step down from the super clean AnalysisResults.SectionCutAnalysis method used before.
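One way I could make that picking-apart less brittle (just a sketch of the idea, not what the tool currently does): the flat TableData2 array appears to come back row-major, NumRecords rows by FieldKeysIncluded2.Length columns, so it can be sliced into name-keyed records instead of relying on hard-coded offsets.

// Sketch: slice the flat TableData2 array into one dictionary per record so values can
// be looked up by field name (e.g. "StepNumber") instead of by hard-coded index offsets.
// Reuses the variables returned by GetTableForDisplayArray above.
int fieldCount = FieldKeysIncluded2.Length;
var records = new List<Dictionary<string, string>>();
for (int r = 0; r < NumRecords; r++)
{
    var record = new Dictionary<string, string>();
    for (int c = 0; c < fieldCount; c++)
    {
        record[FieldKeysIncluded2[c]] = TableData2[r * fieldCount + c];
    }
    records.Add(record);
}
// Filtering to the selected load case then becomes a lookup by name rather than by position.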

Pro tip: use the View command in Visual Studio for a variable explorer! So sad I did not know about this until about a week ago. A huge time saver!

The same issue arose when trying to find out what sort of seismic load cases had been generated. The ETABS API developers seem to have made a conscious decision to move away from specific methods and toward database table responses for grabbing all data.

There used to be an auto seismic interface that appeared to work back in the day,

but now these functions do not return anything meaningful. I had to go to database tables to retrieve this info: more unorganized data to sift through and organize myself.

if (CodeName == "ASCE 7-16")
{
    string myTableKey = "Load Pattern Definitions - Auto Seismic - ASCE 7-16";
    string[] FieldKeyList = null;
    string GroupName = "";
    int TableVersion = 0;
    string[] FieldsKeysIncluded = null;
    int NumberRecords = 0;
    string[] TableData = null;
    _SapModel.DatabaseTables.GetTableForDisplayArray(myTableKey, ref FieldKeyList, GroupName, ref TableVersion, ref FieldsKeysIncluded, ref NumberRecords, ref TableData);
    // ...then sift through TableData to pull out the auto seismic parameters
}

Another pain point: why do the database tables for auto seismic not correspond to the section cut force analysis tables??? More coding required to dig through all of this.

Lastly on the coding side, the plots are now interactive if you click on the row in the datagrid.

The interactive plots, which fill the selected row's points in gold, were achieved through a LiveCharts feature called "mappers". After much coding and interaction with ChatGPT, I was able to get a mapper to work with the selected row in the datagrid.
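Roughly, the mapper lets you swap the point fill based on a flag on your data class. A simplified sketch of the idea (assuming the classic LiveCharts.Wpf package and a made-up point class with an IsSelected flag set from the datagrid's selection event):

using System.Windows.Media;
using LiveCharts;
using LiveCharts.Configurations;

// Made-up point class: IsSelected gets set when the matching row is picked in the datagrid
public class SectionCutPoint
{
    public double Station { get; set; }   // location along the section cut
    public double Force { get; set; }     // plotted value (shear, moment, etc.)
    public bool IsSelected { get; set; }
}

public static class PlotConfig
{
    public static void Register()
    {
        // Map X/Y from the point and switch the fill to gold when its row is selected
        var mapper = Mappers.Xy<SectionCutPoint>()
            .X(p => p.Station)
            .Y(p => p.Force)
            .Fill(p => p.IsSelected ? Brushes.Gold : Brushes.SteelBlue);

        // Apply the mapper globally for this point type
        Charting.For<SectionCutPoint>(mapper);
    }
}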

In addition, I attempted to add some code to check if the analysis had already been run and skip the tedious rerun if possible. I was able to use this function, _SapModel.Analyze.GetCaseStatus(ref NumberItems, ref CaseName, ref status);

to check if the load case had run. Unfortunately, the section cut definition table is not available if the model is locked, and I have a question in to CSI regarding this. It seems like you should be able to add section cuts while the model is locked. More on this here.
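A simplified sketch of the check (my reading of the API: the status array reportedly uses 4 for a finished case; selectedLoadCase is a made-up variable):

// Sketch: ask ETABS which load cases have finished running and only rerun if needed
int numberItems = 0;
string[] caseNames = null;
int[] status = null;
_SapModel.Analyze.GetCaseStatus(ref numberItems, ref caseNames, ref status);

bool alreadyRun = false;
for (int i = 0; i < numberItems; i++)
{
    if (caseNames[i] == selectedLoadCase && status[i] == 4) // 4 = finished
    {
        alreadyRun = true;
        break;
    }
}

if (!alreadyRun)
{
    _SapModel.Analyze.RunAnalysis(); // skip this when the case has already been run
}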

Uses for the Tool

Now that the tool can plot diaphragm shears for the code-mandated mass eccentricities, I want to run some sample cases to test some of my old design philosophies. For example, due to computational time constraints, I would often limit my diaphragm shear DCRs to a max of 90% to account for the +e/-e mass eccentricities. From my quick tests, this may not have been as conservative as I thought it was.

 
