The data modeling aspect of implementing the Future Airborne Capability Environment (FACE) Technical Standard continues to be a challenge for numerous organizations. The process is considered by many to be too difficult or intractable for new programs. Where past publications have discussed the mechanical difficulties of developing a data model, this paper illustrates some of the more philosophical and structural aspects of building a data model.
These ideas are introduced through an outlandish story that follows an engineer through the process of developing a data model. Along the way, it surveys the myriad concepts that modelers will face when building FACE™ data models. It also notes a couple of developmental milestones, each with an indication of the model's sufficiency at that point.
While this paper could be more concise and much more technical, abundant artistic license is taken to produce something that might be readable in a single sitting and that may prevent the premature death of an e-book reader succumbing to a sudden flood of sleep-induced drool. It is intentionally over-the-top. Data modeling is a challenging topic and we are doing our best to have a little fun with it. While the technical aspects are intended to be taken seriously, the rest of the paper is not. Enjoy.
Day 1 – Introduction
Legends told of a man – nay, a hero – who towered three whole heads above all others. Every aspect of his existence was beyond compare.
One time, the entire engineering bay had tried to figure out when he arrived at work…and when he went home. It was as if our hero had a sixth sense. When someone got the idea to come in early – hero would mysteriously arrive earlier. When the team stayed late, he was the one who shut down the lab and turned out the last light. And then there was the time he discovered when the engineering team started a pool to decode his work schedule, subverted the entire effort, and won the kitty. One night, while the team went out to grab sandwiches, he programmed the lab robot with a facial recognition algorithm to greet everyone by name and hurl comical insults at them.
The hero, they said, never turned down a challenge and never lost a bet. Thus, when the company’s first data architecture project arrived, our fabled hero was chosen to bring his might to bear and build the data model. He graciously accepted the project and rapidly consumed every book and article written on model-based systems engineering. He even sent an electronic copy of the FACE Technical Standard (Edition 3.0, of course) to a local printer to have it hardbound just so he could tell others that he literally read the standard “cover-to-cover.” And he did. Twice. He even read a silly, self-contradictory (yet oddly self-consistent) whitepaper about how hard it is to data model.[1]
“Drivel,” he thought and just as quickly dismissed the paper.
Fortunately, although not much was actually explained about the process, the FACE Technical Standard was very clear about where to start data modeling – the Shared Data Model. Per the standard, this model “provides the core extensible elements from which USMs are built.”
Hitting his favorite search engine (who are we kidding, our hero had developed his own search engine – 46696e64.com – that he hosted on a top-of-the-line Bitcoin mining rig which he also built in his spare time), he searched for “FACE Shared Data Model” and was instantly rewarded with a link to the publicly released model on The Open Group’s website.
After downloading the FACE Shared Data Model from The Open Group’s public site, he opened the file to start reading it.
At this point, it should be mentioned that heroes of this caliber do not use the text editors of mere mortals. In a pinch, they might use something like Notepad++, but heroes of near immortal status have an ultimately powerful tool – emacs with a full complement of scripts and a complex assortment of key-combinations[2] with outrageous names like: Klingon death grip, iron hook and tap, the dynamo, the double-triple, meta-x-spook, and The Constable.

After unleashing this secret weapon upon the shared data model, he looked upon the screen. What he felt in that moment, no one would ever know. Those looking upon his visage swear they saw fear. Some say they saw a single bead of sweat form on the edge of his brow. Others say it was a tear.
The lore that surrounds this very moment was memorialized with a simple plaque in the shape of a circle. In its very center, there was a dot and around the edge, the date was encoded – in binary – in the proper (and only logical) format day-month-year. Every single person in the lab (and even the lab puppy – which was a Labrador retriever – so yes, the lab’s lab) swore that the plaque was a circumpunct and represented how everyone rallied around the hero. Everyone knew, however, that it was a tear viewed from the top, as it – pulled by gravity – hurtled toward the floor to form a teeny tiny splash that, despite the minuscule amount of moisture, washed the hopes and dreams of the team away for nearly two months.
Yes, friends, that day, the hero felt something he had never felt in his entire life. Was it fear? Doubt? Uncertainty? To be fair, not even our hero was sure because this was such a foreign experience. So, he did what he always did when confronted with discomfort: he took a step forward. He started studying the model – after all, how hard could a little XML be?
After a couple of minutes (and a couple of elisp macros to help navigate), he was comfortable navigating around the model, but that didn’t make it any easier to understand.
The Shared Data Model (or “SDM” as the professionals call it) is divided into two main sections: the Conceptual Model and the Logical Model. It was easy to distinguish between the two segments of the model, since each element was labeled with either “conceptual” or “logical.” Furthermore, the model was organized by sets of opening and closing tags (after all, this is XML) which gave the model a hierarchical organization.
In the conceptual model, he found a number of “observable” nodes. This type of element only existed at the conceptual level. These elements were intended to correspond to physical phenomena that could be, well, observed. Despite having a reasonably complete set of physical things that can be observed – a list which contained everything from distance to speed to absorbed dose and even weird things relating to radiance – that is all it seemed to be. It did not lead to anything else.
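Schematically – and this is only a notional sketch, not a verbatim excerpt, since the real SDM’s XML is far more verbose and its exact element and attribute names differ – an observable entry looked something like this:

    <!-- notional sketch of SDM observable entries -->
    <element xsi:type="conceptual:Observable" name="Speed"
             description="The magnitude of the rate of change of position."/>
    <element xsi:type="conceptual:Observable" name="Temperature"
             description="A measure of the thermal energy of matter."/>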
This is a data model? He quickly filed the thought away and continued inspection.
Hero spent some time reading the definitions of the observables. It seemed as if the definitions provided some much-needed clarification on the meaning of the observables. The labels themselves were not descriptive enough, so, he reasoned, the ultimate meaning must come from the name and definition.[3]
Having read the standard three times, our hero knew that he would find much more information about the observables in the Logical Model. This was one of the so-called variation points that the architecture was intended to allow. It just makes sense. And, to our hero, it did. But to the bystanders, to the mere mortals who were not privy to the inner workings of his mind, it was not nearly as obvious.
The structure of the data model was intended to allow software interfaces to be documented simultaneously at three different levels of abstraction. The most abstract representation is the Conceptual Model. This aptly named model is merely full of concepts. The concept of position is represented simply as the all-encompassing idea of position. The Conceptual Model does not detail the position beyond the definition provided by the observable and the context in which it is used.
The Logical Model is a little more specific. It describes how the different observables are measured. Thus, things like coordinate systems and units are combined to define measurements. A measurement might define the fact that a Temperature (this is another SDM observable) can be measured using Celsius, Fahrenheit, or Kelvin. By accurately documenting this information, one can ensure that the proper units are used to express the value in an interface. And, since different systems can use different units, the measurements provide an acceptable variation point for developing a system using a data architecture.
The most concrete level of the data model is called the Platform Model. This model describes the data exactly as it is presented on the wire. Computers communicate in bits and bytes – not concepts. However, as most programmers have seen, some applications may represent a value as a float while others represent it as an integer or even a string.
Figure 1 – One concept may be represented in many ways at different levels of abstraction
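To put some meat on that figure’s bones with a concept from this very story (the logical and platform choices shown here are merely illustrative):

    Conceptual:  Position – the idea of where the vehicle is
    Logical:     Position_WGS84_LatLonAlt_Radians_Meas – latitude and longitude in radians, altitude in meters
    Platform:    three IEEE 754 doubles on the wire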
So, just like our hero said, it just makes sense. In spite of the four times our hero read the technical standard, he printed out Figure 34 on Page 328. It was the key to understanding the logical content. (Yes, ladies and gentlemen, Figure 34 is the equivalent of the Little Orphan Annie Secret Decoder Ring for the Logical Model. While it will most certainly not tell you to “Drink more Ovaltine,” it will make clear the enigma that is the logical model.) This diagram shows all the relationships between the different elements in the logical level.
Hero looked around. The lab was quiet…everyone had gone home. As he stood up, he noticed something on the floor, so he leaned over to inspect it. Odd, I don’t remember eating anything purple and gooey today.
Those who have spent any time data modeling know what that purple goo is, but our hero was just beginning his journey.
Day 2 – Forming
Ideally, the second day of data modeling would have started after the day’s first cup of coffee, but things did not quite happen like that. Many times throughout the night, our story’s hero awoke in a cold sweat. Each time, he found specters of metamodel diagrams stuck in the corners of his mind extending their orientation attributes out to drain the life out of any unwary modeler who dared to get too close.
At some unholy hour of the morning, long before the sun contemplated breaking the cool of the night, while headlights were not only a good idea, but an absolute necessity, our hero returned to the office.
Today, he would endeavor to build a data model.
After logging into his Linux boxen and firing up emacs (okay, it was Xemacs – and our hero still ran Window Maker because it was, as he said, “wickedly fast”), he loaded up the SDM and endeavored to create an entity.
Reaching into his project archive, he found the Interface Control Document (ICD) for the definitions of all the interfaces his system needed to implement – Open Stovepipe 42.
Hero turned to the first message:
Table 1 – Definition of fictitious Open Stovepipe Message 1
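The table itself ran along these lines – a notional reconstruction, since only the position fields are pinned down by the narrative; the orientation rows are illustrative:

    Field        Description                          Units     Type
    position_x   Vehicle position along the x-axis    meters    double
    position_y   Vehicle position along the y-axis    meters    double
    position_z   Vehicle position along the z-axis    meters    double
    roll         Vehicle roll angle                   radians   float
    pitch        Vehicle pitch angle                  radians   float
    yaw          Vehicle yaw angle                    radians   float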
As a quick aside, it should be noted that our hero didn’t simply start hand coding the data model in long-form XML. Although it would have been well within his technical ability, our hero was eager to make progress as quickly and easily as possible. So, he challenged himself to develop a custom data modeling language that ran completely within emacs. However, he didn’t start the challenge until his first colleague arrived and was just working out his last bug as the last of the team members trickled in. Challenge complete. Elapsed time: 27 minutes.[4]
At this moment, our hero had to slow his typing and steady his pulse. The vein in his forehead pulsed in perfect synchronicity with his finger tapping the well-worn space above the up-arrow key on his keyboard. What came next? The documentation captured all three levels of the data model – Conceptual, Logical and Platform. In the description, it clearly indicated that this field was a position. The units made it clear that this was measured in meters, and the platform type should be a double.
So, clearly all three data models needed to be built.
Perhaps he needed to rethink his approach, but for the moment, our hero decided to focus on building the conceptual model. After all, Rome wasn’t built in a day!
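His first pass mirrored the ICD field-for-field. In the same notional shorthand as before (a sketch, not actual tool output), the conceptual entity came out roughly as:

    <!-- notional first-pass entity: one attribute per ICD field -->
    <element xsi:type="conceptual:Entity" name="Vehicle_Pose">
      <attribute name="position_x" observable="Position"/>
      <attribute name="position_y" observable="Position"/>
      <attribute name="position_z" observable="Position"/>
      <attribute name="roll"       observable="Orientation"/>
      <attribute name="pitch"      observable="Orientation"/>
      <attribute name="yaw"        observable="Orientation"/>
    </element>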
Satisfied with his effort, the hero moved on to the next message. This might not be too hard, after all!
Following the precedent established by his previous example, our hero attacked his keyboard with zeal, creating entities and adding attributes until he completed a model entry for the second message. However, as our hero tapped in these commands to build the data model, something started bothering him. Somewhere, deep in the recesses of his mind, our hero started to feel uneasy. Thinking this was merely a result of too much coffee (remember, it was a very early morning), he quickly dismissed the feeling and pressed on.
Having read the standard five times, hero was getting pretty familiar with the list of observables. As such, this made it all the easier to build the entities for his data model.
Since all good things come in threes[6], the author decided to have our hero develop one more entity.
Easy peasy lemon squeezy.
Just like the last time, our hero quickly entered the data model content just as he read it from the ICD. He paused a little bit as he came to speed.
An attribute of type speed named speed. Seems redundant. And…interesting…they represent position differently in the command.
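In the notional shorthand, the command entity came out roughly like this – note the same Position observable wearing a latitude/longitude/altitude costume this time:

    <!-- notional sketch: same observables, different representation in the ICD -->
    <element xsi:type="conceptual:Entity" name="Vehicle_Position_Command">
      <attribute name="latitude"  observable="Position"/>
      <attribute name="longitude" observable="Position"/>
      <attribute name="altitude"  observable="Position"/>
      <attribute name="speed"     observable="Speed"/>
    </element>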
He felt something lightly tap his right shoulder.
When he looked around to see who was seeking his attention, no one was in the lab. They had all come and gone – mostly without notice. He was left there alone with a growing sense of foreboding.
Another day was complete. As he shut down the lab, he noticed another little blob of purple goo on the floor beside his chair.
Day 3 – Storming
Although sleep came a little easier that night, it was punctuated with countless neurochemical attacks. In one battle, our hero dreamed that he was an octagon block trying to work with the classic Tupperware shape sorter. The problem, however, was that the shape sorter did not have an octagon slot – the hero did not fit anywhere. As soon as his mind realized that he was neither a circle nor a hexagon, his dream-form shape started to flatten and change to a bright, firetruck red. With a perspective that only an unconscious brain drunk on serotonin can provide, our hero realized he was now staring at a stop sign. Then, like one might imagine a digital amoeba to reproduce, the foreboding octagon started cleaving and multiplying into a sea of red stop signs and then, finding the smallest of all the octagons, hero’s viewpoint zoomed in on it until it filled his entire field of view.
He awoke with a start and, at that moment, the reality our hero had dismissed yesterday as bad Mexican food (or too much coffee) caught up to him with the rapidity of the dolly zoom paralyzing Jimmy Stewart with a bad case of vertigo. His dream was trying to tell him to stop because something did not quite fit.
Unlike the day before, it was the normal time for hero to get ready for work. And, as with every day before, he arrived in the lab before anyone else.
Seeking reasons he should heed the warning provided by his subconscious, hero endeavored to evaluate his data modeling prowess by testing the model in the Conformance Tool Suite[7]. In order to integrate the CTS with emacs, hero decompiled every single JAR in the CTS distribution and looked for files with “main” functions. It didn’t take long before he happened upon the most useful text:
Score! The CTS user interface could be skipped and a simple data model conformance check could be executed from a shell command.
And just as quickly as it had begun, it was over. Alas, there were no smiley faces, no happy emojis, just an exit code and a very cold, stoic message:
As any developer as seasoned as our hero knows, an error might just be the tip of an iceberg. So, he opened the log for further inspection.[8]
Hmm, this is most unexpected. Having read the tech standard six times, he did not recall any mention of a constraint requiring a unique identifier in a data model entity. So, hero did what any geek of his stature would do – he opened the OCL constraints for more information.
Buried one layer deeper in the CTS, there was a directory called “constraints” that had a collection of OCL files. Since he was working in the Conceptual Model, hero opened the file called “conceptual.ocl” (in emacs, of course) and looked for “hasUniqueID”. Just a few dozen lines down, he was rewarded with a description to this effect (paraphrased here rather than quoted):
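An Entity composes exactly one attribute that uniquely identifies it.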
While there was definitely an observable named “Identifier,” the addition of this attribute would make his data model look different from the ICD. With a growing sense of apprehension, hero inserted an identifying attribute into each of the entities and tested the model once more.
That zero – the exit code of a clean run – had a steeling effect on our hero. At that moment, he let out a subtle sigh of relief – the Titanic is safe – and resolved to build out the Logical and Platform Models this day. Since the Logical and Platform Models were generally just type-substituted versions of the models above them, they might, he posited, even be automatically generated once the measurements and primitive types were selected.
When he had looked up the details of the OCL constraint, he had noticed a number of files with interesting names, but one specifically caught his eye, observableuniqueness.conceptual.ocl. This file was identical to the other conceptual.ocl file with one exception: there was one extra constraint:
An Entity does not compose the same Observable more than once.
Reviewing the requirements given to him by his boss, our hero saw one that he had missed: The data model shall be constructed using observable uniqueness. Ah, so that’s what this means, but how did my model pass the validation checks?
Remembering that there were a couple of other command line options for running the model validation tool, our hero ran the application without any parameters, and it listed all of the options. Not only did he find a switch for “observable uniqueness,” there was one listed for “entity uniqueness” as well.[9] Fortunately, he did not have a requirement for entity uniqueness.
Hero reran the CTS to evaluate the model with observable uniqueness enabled, and, sure enough, all three entities were reported to violate this constraint.
Of course, they have three positions, orientations, and orientation velocities.
But then, he remembered. The Conceptual Model captures the concept; the Logical Model captures how that concept is measured.
Looking back at the entities he modeled, he recognized his mistake – especially when contrasting the vehicle pose and vehicle position command messages. Hero felt another tap on his shoulder: just like yesterday, no one was there when he looked.
Vehicle Pose reports the position in X, Y and Z coordinates. The concept is not X, Y and Z. That is merely how the position is represented (or measured) using one particular type of measurement system. In this case, the origin represents the center of the Earth with the positive x-axis through the equator and prime meridian, the positive y-axis through 90˚ east longitude and the equator, and the positive z-axis through the North Pole.
Vehicle Position Command, on the other hand, uses latitude, longitude, and altitude coordinates to represent the concept of position. As with X, Y and Z, the concept is not latitude, longitude, or altitude. In this case, the origin is the intersection of the prime meridian and the equator at an altitude of zero.
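For a concrete sense of the difference: the spot where the prime meridian crosses the equator at zero altitude is (0, 0, 0) in latitude/longitude/altitude terms, yet that very same point expressed in the XYZ system is roughly (6378137, 0, 0) meters – the full WGS 84 equatorial radius pointing out along the positive x-axis. Same concept, wildly different numbers.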
The thoughts were flowing so quickly, the ideas, the principles, the concepts of data modeling were engaging inert neurons, dormant dendrites, and ambivalent axons. Cascades of thoughts were flowing as more synapses formed. His head hurt. There was another tap on his shoulder: again, no one was there.
Revisiting the entities previously modeled, our hero was able to substantially simplify the model structure.
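In the notional shorthand, each slimmed-down entity now composed each observable exactly once:

    <!-- notional sketch after applying observable uniqueness -->
    <element xsi:type="conceptual:Entity" name="Vehicle_Pose">
      <attribute name="identifier"  observable="Identifier"/>
      <attribute name="position"    observable="Position"/>
      <attribute name="orientation" observable="Orientation"/>
    </element>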
Rerunning the CTS, our hero was rewarded with that coveted status message:
Right then, our hero realized that Skye was licking a spot on the floor beside him. Curiously, he watched for a few moments until the dog ambled a few steps away. There was a small purple smudge left behind. There’s a lot of purple around here lately. Instead of returning to sleep over in Jen’s cubicle (Skye always sleeps in Jen’s cubicle – some say it was her Australian accent, others say it was the soft blanket she kept on her floor), she plopped down a couple of feet away, and stared at our hero, occasionally licking her chops.
It seemed – to our hero – that the model was getting better. Unfortunately, it was starting to look a lot less like the ICD than he had originally envisioned. Knowing that the Logical and Platform Models existed to provide such variation left him little comfort, but he pressed on regardless. The ICD made the type selection for the platform layer abundantly easy. Doubles and floats were the order of the day. Using his software tool[5], he registered the use of doubles and floats for platform types and moved on.
Choosing the measurements would only prove to be slightly more difficult. As with the primitive types, the ICD clearly indicated the units. Since he had read the standard seven times, our hero knew there was more to choosing a measurement than merely selecting units. Instead of relying solely on units, he looked for all of the measurements that used the observable he needed.[10] And remembering that the Logical Model was the variation point for how an observable was measured, our hero started by choosing the appropriate measurements for each of the attributes.
For Vehicle Pose, he chose Position_XYZ_Meters_Meas and for Vehicle Position Command, he chose Position_WGS84_LatLonAlt_Radians_Meas[11].
It was finally time for our hero to ponder the creation of the query and template. He could simply include all of the attributes from the Vehicle_Position entity, but he could not do so in both the query and the template. If he included all the attributes in both, the identifier attribute would land in the message – and that was not part of his message.
On one hand, he could choose to select everything in the query and narrow the scope with a more specific template. On the other hand, he could narrow the select and have a very generic template. Or, he could narrow both!
Although the query and template work together to specify the application’s interface, the template really carries the burden of ensuring the data is properly formatted. Tracking with that thought, our hero decided to use the former approach – use a broad query and get specific in the template.
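In pseudo-query form – the standard defines its own query grammar, so take this as the shape of the idea rather than valid syntax – the broad query swept in the whole entity, identifier and all:

    -- notional pseudo-query, not the standard's actual grammar
    SELECT identifier, position, orientation
    FROM   Vehicle_Position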
And then the template:
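(Again in notional shorthand, with the real template grammar left to the Technical Standard:)

    -- notional pseudo-template: project only the fields the message needs
    Vehicle_Position_Msg:
        position
        orientation
        -- identifier deliberately omitted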
Ha! Take that, identifier!
Once again, our hero prevailed over a technical challenge before him! Despite the modeling requirement to include an identifier in the entity, the query and templates allowed him to, effectively, filter that information out and present only what was necessary. This gave rise to another fleeting thought that, for some reason, hero synaesthetically equated to a stop sign: I wonder what else I could filter out!?
Once again, the lab was empty. Our hero had a vague recollection of Skye padding out an hour or so ago, but nothing since then. He had been ruminating too deeply over his data model to notice. With another day in the can, our hero headed home.
Day 4 – Norming
After shaving, he grabbed a cotton swab to clean out his ears and as he went to toss it into the trashcan, he noticed it was covered in purple. Running over to the laundry basket, he retrieved the shirt he wore yesterday and inspected the right shoulder. There, faintly, were several different purple dots.
I need to wrap up this data modeling, it’s causing me to leak brain fluid again.
As our hero grabbed his morning coffee and headed for the door, his phone rang. “Hello.”
“Are you okay?” a familiar voice asked from the other side of the connection. What an odd question, but given the purple goo, I might have some concerns – not that I expect anyone else to know about that.
As our hero was forming an appropriate response, his phone beeped – a call on the other line. “Hold on a second.” And he switched to the other line. “Hello?” This time, his answer was a little apprehensive.
“Morning, uh, are you ok?” Wow, this purple thing must be a much bigger deal than I realized.
He quickly cleared both lines, telling his colleagues that all was well. It wasn’t until he arrived at the lab that he understood their concern. When he walked in, everyone was already there – and appeared to have been for quite some time. Having received his assurances that he was, in fact, okay, he simply shook his head, as if to clear the morning cobwebs, and slid in behind his computer.
As soon as our hero sat down, Skye ran up beside him, sat down, and looked at him expectantly with her tail joyfully wagging. Something else just clicked in our hero’s mind. Ew. Gross.
Having comfortably settled into his chair, our hero spent the rest of his day working through the remainder of the Open Stovepipe ICD – developing entities for each of the different messages. However, unlike his first attempts, our hero made sure to eliminate the logical representations in the conceptual model.
There were times when he was not entirely sure which observable to choose. When that happened, he merely consulted the Shared Data Model. This was a particular bit of insight that rather pleased him: in some cases, it was easier to choose the measurement than the observable. After all, despite his impressive capabilities, our hero was a developer – not a physicist. Although he might not know the difference between magnetic flux density and luminous intensity, it was possible to find a measurement with the appropriate units (as specified in the ICD) and work backwards through the Shared Data Model to the observable.
At several points throughout the process, he got the sneaking suspicion that he could be doing something better, but precisely what that was continued to elude him until he came to the Vehicle Geo Pose message.
Following his established process, our hero entered the data for the message. As he stared at the result, a powerful sense of déjà vu overcame him. This was exactly like the first entity.
He sat there a bit longer contemplating the two messages. There were several taps on his shoulder, and he detected the dog getting ever more excited. At one point, Skye might have even started licking his ear, but he was so lost in thought, it didn’t really even register on a conscious level.
What is the difference?? Do they really have the same message? Vehicle Geo Pose. Vehicle Pose. The difference is…well…Geo.
And then, like a flash, it hit him. Although his process was good about removing logical assumptions from the attributes, he had not considered the possibility of logical content having been baked into the entity itself. This was the same entity as Vehicle_Pose – there was no reason to duplicate it. The proper message could be created simply by making different logical choices.
As another profound insight teased at the edge of his mind, his ear started to itch and then a loud “bang” shattered his concentration. What his eyes found as they focused on the scene was both tragic and hysterically funny. Apparently, something had caused Skye to dash across the lab and crash into Jen.[12] Our hero noticed something purple on the Australian’s face, just as Skye seemed to lick the last bit of it away. Gross.
As the excitement abated, our hero started to think about his second entity. Aside from the identifier (because everything has to have one of those), Vehicle Pose and Vehicle Attitude shared half of their properties. Would it be possible to eliminate another entity just like he had with Vehicle Geo Pose?
Not exactly. There was no logical content encoded into the name of Vehicle Attitude. But…the identifier doesn’t show up in the message. Could I do the same thing with orientation???
On a whim, our hero tried combining the two entities. The idea seemed to hold together. After all, the orientation velocity was another aspect of the…
More clarity. The position, orientation, and orientation velocity were all aspects of the concept of a Vehicle. They were not concepts of the Vehicle Pose. These messages were merely different ways of looking at the same concepts. Different units, different primitive types, different combinations of attributes to send at different rates, different on-the-wire encodings. The actual data model did not need to reflect the ICD – that is why the Queries and Templates exist.
There were countless variations of how to view this information, but there was a finite set of core concepts the ICD was talking about. And with that set of insights, our hero pivoted his approach from developing a message model to developing an entity model.
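The end state, rendered in the notional shorthand one last time, was a single entity capturing the concept of the vehicle, with each ICD message recoverable as a query/template pairing over it (attribute and observable names illustrative):

    <!-- notional sketch of the final, message-agnostic entity model -->
    <element xsi:type="conceptual:Entity" name="Vehicle">
      <attribute name="identifier"           observable="Identifier"/>
      <attribute name="position"             observable="Position"/>
      <attribute name="orientation"          observable="Orientation"/>
      <attribute name="orientation_velocity" observable="OrientationVelocity"/>
      <attribute name="speed"                observable="Speed"/>
    </element>

    Vehicle Pose             -> position + orientation (XYZ meters, doubles)
    Vehicle Geo Pose         -> position + orientation (lat/lon/alt – different logical choices)
    Vehicle Attitude         -> orientation + orientation_velocity
    Vehicle Position Command -> position + speed (lat/lon/alt radians)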
As our hero worked back through his effort up to this point, he ran across many cases that could be reduced and many that could not. There was one entire section of the messaging standard that communicated most of these same concepts, but for a Camera. However, instead of collapsing the Vehicle and the Camera into a single “Movable Thing” entity, it made sense to keep them separate since they seemed to be two distinct concepts. (There’s that word again.)
Epilogue
After a couple of short weeks, that strange sense of foreboding dissipated, and our hero was feeling like himself once again. And, he was feeling quite proud of his efforts having developed a reasonable entity model. Observables flowed into measurements which were beautifully constructed from coordinate systems and value type units. The model exported successfully – finally – and the conformance tool suite passed his model with flying colors.
The hero stood, positioned himself in the middle of the developers’ bullpen, and announced that he had mastered the data model. Now that it was finished – ahead of schedule – he was going to petition for a week off for the entire engineering team. With a flourish of grandeur, he patted himself on the back, and returned to his desk.
Unfortunately, no one could understand what he said. Had they been able to reference a data model of his brief speech, it would have been immortalized as the clearest and most unambiguous speech ever given. However, that was not the case and the literal speech was something more like, “data model calendar time in the context of task in the context of contract has relationship to employee dot vacation.”
At one point, one of the mid-level engineers looked over at his neighbor and said, “he used to be taller, didn’t he?” The neighbor just shrugged and The Circumpunct fell from the wall and clattered to the floor.
Summary
In our story, continuing revelation (and the fictional magic of time dilation) allows our hero to iterate over his data model several times. The use of the CTS in the story shows that the model was acceptable at a couple of points throughout its development.
For the sake of using the model as a conformance artifact, a very simple model can be produced. This can be something like the first model our hero created. The second attempt (wherein the logical concepts were removed from the Conceptual Model) led to a better model, allowing reuse with different logical choices (e.g., measurements).
In the initial attempts, the data model was a direct reflection of the partitioning of the data in the messaging standard. Although sufficient, it does not generally lead to a reusable product since other messaging standards combine information in different messages. Thus, the two standards are not likely to map to the same model entities.
The final attempt represents the approach with the greatest likelihood of reusability. In this case, the domain entities are modeled independently of the system interfaces (messages). An interface is then documented by composing the data from one or more entities. This approach will accommodate countless message standards (or interface designs) since the conventions of the standards are not baked into the model.
Data models have utility far beyond mere conformance. In ideal conditions, they can be used to automate the development of many aspects of integration software. Of course, it is not always within scope or necessary to build such a rigorous model. Project, customer, and contract goals will likely drive and govern the necessary rigor.
One may wonder why the development team was not involved in the story. For the most part, the developers are concerned with implementing the system interfaces. In this paper, the Open Stovepipe messages were used to represent these interfaces (though there is no reason messages should equate to interfaces, but that is another matter for another forum). Since these messages were already fully specified, there was no requirement for any interaction between the modeler and the developers. The development of the data model could progress independently of application development.
Finally, although the software tool described in this paper is fictional, it is a completely realistic type of tool to develop. As discussed in previous papers, the difficulty of data models should not stem from tooling problems. The challenge of data modeling has to do with organizing the domain’s concepts into a structure that yields both accurate and readable documentation.
[1] Unfortunately for the real world, this paper does exist and you can read it here: https://www.skayl.com/post/data-modeling-is-hard-vs-data-modeling-is-hard
[2] These were referred to as the “Grips and Scripts” until one unfortunate evening after the SWAT team arrived thinking they were about to bust a violent and angry street gang.
[3] This is the intent of the observable definitions.
[4] One should not feel inferior to Hero’s speed. He already had the skeleton and parsing capabilities, he just needed to tweak his scripts to read the emof provided in the tech standard.
[5] This is a purely notional and fictitious modeling tool and not a clever marketing ploy in disguise.
[6] The author hereby submits as evidence: The Three Amigos, Three’s Company, The Three Stooges, three ring circuses, rock-paper-scissors, and The Matrix Trilogy. (Oh right, scratch that last one.)
[7] While it is far beyond the scope of this paper to explain the configuration and use of the CTS, it is completely within reason to explain a shortcut that makes this whole data model evaluation process easier.
[8] Output was slightly edited for purposes of formatting and presentation.
[9] This paper is not going to tackle entity uniqueness, but the principles are similar. Essentially, there should be no two entities defined with the same set of composed attributes.
[10] This list has been modified for the sake of the paper. There are a number of other position measurements in the SDM v3.0.4.
[11] You are probably wondering about the altitude. Why didn’t the name of the measurement mention meters or feet? There really isn’t a reason – it is just sort of omitted. That’s okay, though. Since our hero read the standard eight times, he knew that he needed to check the value type units on all of the measurement axes (and possibly the measurement system axis) and verified that Position_WGS84_LatLonAlt_Radians_Meas used meters for altitude.
[12] Hey, I worked really hard to weave this joke into the paper. I really hope it hit. If you didn’t get it, gather some of the related words and search for them together with the year 1979. Too much effort? Okay, last hint: Skye, the lab, crashed into the Australian.
References
About the Meta Object Facility Specification Version 2.5.1, 2020; www.omg.org/spec/MOF/.
C. Allport, G. Hunt; Data Modeling is Hard, 2017; https://www.skayl.com/post/data-modeling-is-hard-vs-data-modeling-is-hard
S. Hand, D. Lombardi, et al.; Interface Description Maturity Levels, 2018; https://www.skayl.com/post/interface-documentation-maturity-levels-idml-an-introduction
AUTHOR
Chris Allport
Chris Allport is a co-founder of Skayl. Allport is a specialist in realizing concepts into advanced prototypes. He has a proven track record with numerous aspects of the aerospace industry: from leading the development of standards to hands-on software development and integration. Throughout his professional career, Allport has been called upon to lend his expertise to myriad interoperability activities. In earlier years, these efforts were met with simple, point solutions intended to solve the problem at hand. However, as the years have progressed and the challenges have grown more complex, he has had to develop increasingly sophisticated solutions to a variety of integration challenges. Allport earned a BS in Electrical Engineering, a BS in Computer Engineering, and an MS in Electrical Engineering from West Virginia University.
This paper was published as a part of the June 2020 The Open Group FACE™ & SOSA™ TIM