Building Blocks, Foundations, & Enterprise Architectures

Languages (spoken, visual, mathematical, etc.) exist because they are the building blocks for communication, understanding, and ultimately, relationships. Relationships form the foundation for social networks, communities, strategic partnerships, and more complex systems. These systems, and the interactions within and across them, are a basis for life and living.

The problem is that a shared definition and conceptual understanding of these building blocks, foundations, and higher-level systems often does not exist. As a result, technology development efforts, strategic partnerships, marketing campaigns, and the like suffer from a lack of true coordination and comprehension.

In general, identifying building blocks, establishing foundations, and defining more complex systems and interactions is critical to advancement in this world. In most cases, establishing these foundations provides a much-needed platform for coordination and comprehension in support of a higher objective. In other cases, attempting to define abstract concepts and inherently complex systems is a fruitful exercise in itself, driving constructive debate, new questions, and lessons learned for the primary stakeholders involved.

With this in mind, I seek to outline some building blocks and establish a simple foundation for enterprise architectures. My hope is that by initiating this exercise, it may provide some conceptual clarity to non-technical folks and demonstrate a framework through which other systems can be defined and explored.

The Building Blocks of Enterprise Architectures

In general, an enterprise represents people, information, and technology joined by common needs, objectives, and/or behaviors. An enterprise architecture helps define the structure of the enterprise to enable the people, information, and technology to interact in an efficient, effective, relevant, and sustainable manner.

  • People – Represents individuals or the various organizational constructs that contain individuals, such as a program, agency, domain, or community of interest.
  • Information – Represents all consumable data, products, and knowledge that is collected or created by other elements of the enterprise.
  • Technology – Represents the infrastructure components, networks, capabilities, systems, and programs that support other elements of the enterprise.

The Foundation for Enterprise Architectures

Now that the puzzle pieces have been broadly defined and we have a simple lexicon to work with, we seek to: (1) outline how these building blocks might fit together to support various operational needs, analytical use cases, and other tasks/functions; and (2) identify the logical connections, interactions, processes, and/or relationships between and amongst the building blocks.

The diagram below begins to define this foundation, logically placing enterprise elements (people, information, technology) to support coordination and comprehension. This would then support the examination of each possible pair of building blocks (e.g. people and information) to define the enterprise architecture and identify critical interdependencies within the system.

Enterprise Architectures: Technology Focus

To this point, establishing definitions and diagrams provides us with a core foundation for understanding end-user requirements, identifying security implications, pinpointing system interdependencies, and supporting system analysis efforts. Focusing on the technological components of our enterprise architecture, we have categorized them into three logical tiers (a simple illustrative sketch follows the list):

  • Top Tier (Front-End) – Represents the technologies that support end-user interactions (data access, analysis, visualization, collaboration, input, personalization, etc.) with information/data and other stakeholders.
  • Middle Tier – Represents the utilities, services, and support components that optimize system interactions amongst all people and information.
  • Bottom Tier (Back-End) – Represents the core information architecture, system security, and access/identity management components that support a secure, efficient, and effective operation.
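
To make these tiers a bit more concrete, here is a minimal, purely illustrative sketch in XML – the element names and example components are my own placeholders, not a standard or a mapping to any particular product:

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- Illustrative only: example components for each logical tier -->
    <EnterpriseTechnologyArchitecture>
      <Tier name="Front-End">
        <Component>Search, visualization, and collaboration interfaces</Component>
        <Component>Personalization and user input features</Component>
      </Tier>
      <Tier name="Middle">
        <Component>Shared services and utilities (messaging, workflow, data exchange)</Component>
      </Tier>
      <Tier name="Back-End">
        <Component>Core information architecture (databases, repositories)</Component>
        <Component>Security, identity, and access management</Component>
      </Tier>
    </EnterpriseTechnologyArchitecture>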

The bottom line is that defining building blocks and outlining foundations is a critical first step toward coordination and comprehension. Sometimes just putting words and diagrams on paper saves valuable design and development hours, or at least drives valuable discussion. Particularly in the world of enterprise architectures, this process is critical to align stakeholders up front and to put development efforts in perspective. Whether it’s boxes, lines, definitions, or discussions, sometimes a little language goes a long way.

Ten Technological Concepts Compressing the Analytical Timeline

Today’s difficult economic climate continues to drive increased competition for all organizations. Shrinking budgets are placing government departments and agencies under more pressure to increase the operating efficiency and cost-effectiveness of programs and technologies. Across industry, fragile markets have forced organizations to scrutinize the need for every project, person, and printer in order to reduce operating costs. In the non-profit sector, shrinking funding streams have increased the pressure to demonstrate value through concrete, measurable results.

In order to stay competitive within their particular domains, markets, and user communities – and to ultimately achieve growth and sustainability in any economic climate – all organizations must find ways to increase operating efficiencies, eliminate programmatic redundancies, and produce measurable results. Luckily for these organizations, several technological concepts have emerged over the past decade that help support these practices. In that regard, the acknowledgement, understanding, and implementation of these concepts across organizational units, programs, and processes will compress the analytical timeline and allow organizations to learn, control, adapt, and anticipate over time.

Here’s a quick look at some of the technological concepts/trends that are compressing the analytical timeline, allowing organizations to act on insights more quickly, more effectively, and more accurately:

  1. Data Collection Mechanisms – It’s not just about collecting more data, although volume (in many cases) helps. It is about collecting more types of data (image, audio, video, open source media, social media) and collecting more tactical data. The growth of the mobile and tablet markets, the ease-of-use of such devices and their decreasing costs, and the expansion of mobile network infrastructure around the world are helping organizations collect more diverse, tactical, and (ultimately) valuable data.
  2. Data Cleansing/Processing – Rather than ignoring unstructured data, we are beginning to embrace it. Many COTS, GOTS, and even open source technologies exist that cleanse and process unstructured data to ensure it can be used to support relevant use cases. Where unstructured data was formerly omitted from the analytical landscape, these technologies are now bringing new value and context to insights and decisions. I would also include here the data storage/warehousing and processing capabilities that support big data analytics and data mining, which provide a quicker means of combing vast amounts of data for relevant patterns and insights.
  3. Logical Data Structures – It seems we are finally learning that a little thought and planning up front does wonders for the types of analysis needed to support operations research, performance measurement, marketing, and other organizational practices. By building logical data structures, we can quantify things otherwise unquantifiable and ultimately make timely, informed decisions otherwise made by intuition alone.
  4. Data Standards/Models – In conjunction with building supportive, internal data structures, we are beginning to understand how data models within domains, across communities of interest, and for specific problem sets can do wonders for our analytical practices. By developing and/or adopting a standard, we can bring consistency to these analytical practices over time, even through personnel changes. No more one-off studies/reports, but rather repeatable and communicable analysis.
  5. Data Source Registries/Catalogs – It is slowly being understood that ubiquitous access to raw data sets is far from a reality. However, organizations are beginning to realize that data source catalogs (registries) across organizational units and/or communities of interest are a step that can quickly facilitate more effective data sharing practices. Rather than focus on the exposure of raw data, the data source catalog first involves the exposure of data source metadata – information about the data, but not the data itself (an illustrative catalog entry appears after this list). This data sharing approach is more strongly rooted in trust and visibility and, ultimately, can provide a platform by which analysts can gain quicker access to more relevant data.
  6. Social Networks – The social network movement has done many things to compress the analytical timeline, including, but not limited to: driving more collaboration and interaction between data owners, analysts, end users, and ordinary people; opening a new means by which more tactical data can be accessed and collected; and facilitating the development of new platforms, applications, and technologies to glean insights from data.
  7. Identity Management, Access Control, & Cyber Security – Knocking down stovepipes can provide better access to data, which in turn means less time collecting data and more time analyzing it. However, stovepipes provide organizations with another layer of security to prevent data breaches. Despite this contradiction, better identity management, access control, and security technologies are being developed to maintain a high level of control while still ensuring users can more easily access data traditionally hidden within stovepipes. In turn, the time spent accessing and integrating data is decreased and individuals can spend more time analyzing disparate data and delivering quality insights.
  8. Cloud Computing – The movement of information systems and applications to the cloud is transforming the analyst from being a thick-client-loving info hog to being a platform-agnostic, collaborative participant. With more data and tools exposed to individuals, no longer constrained by a single hard drive or device, analysts can more effectively and efficiently access, collect, integrate, visualize, analyze, share, and report on data and insights.
  9. Network Infrastructure – The expansion of existing connected and wireless networks as well as the development of new, quicker, more accessible, and more secure networks will continue to compress the time it takes for analysts to provide valuable insights.
  10. Customizable & User-Defined Interactions – Allowing individuals to define how they wish to visualize, analyze, and interact with relevant data provides analysts with the ability to focus on developing solutions rather than setting up problems. The “user-defined” movement provides flexibility and adaptability to the individual and allows a wider set of individuals to become analysts by owning their own workspaces and interactions. It also provides an interactive medium through which results can be presented, making the reporting and dissemination process a two-way conversation rather than a drawn-out one-way street.
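
As a purely hypothetical illustration of the data source catalog concept from item 5 above, a single catalog entry might expose metadata about a data set without exposing the data itself. The element names below are invented for illustration only:

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- Hypothetical catalog entry: metadata about the data, not the data itself -->
    <DataSourceCatalog>
      <DataSource id="ds-001">
        <Name>Regional Field Survey Responses</Name>
        <Owner>Program Evaluation Office</Owner>
        <Description>Survey data collected monthly via mobile devices by regional field teams.</Description>
        <Format>CSV</Format>
        <UpdateFrequency>Monthly</UpdateFrequency>
        <AccessInstructions>Contact the program data steward to request access.</AccessInstructions>
      </DataSource>
    </DataSourceCatalog>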

I do want to note that this list is by no means comprehensive. Even more importantly, it only focuses on technological concepts and does not address the numerous cultural and political factors that affect the analytical timeline. Although technology will continue to be a major focus area in supporting quicker and more effective analytical practices, the cultural and political aspects will be more difficult to overcome, and their interdependence with the technological aspects should never be overlooked.

National Information Exchange Model (NIEM) Practical Implementer’s Course Notes – XML Conceptual Review (Lesson 2)

NIEM Practical Implementer’s Course
Lesson 2 – XML Conceptual Review

Core Definitions

  • Elements: The tags that exist within an XML document, collectively termed the “markup”. Types of elements include root, parent, and child.
  • Attributes: Part(s) of an XML element that provide(s) additional information about that element. Attributes are defined and written as a name/value pair (e.g. name="value").
  • Instance: A document containing XML tags and content that results from use of XML schema rules.
  • Well-Formed Instance: An XML instance is “well-formed” if it uses the correct syntax and structure as defined by XML standard(s) being used and meets the minimum criteria for XML parsers to read the document.

General Notes

  • Rules/Guidelines for XML Elements
    • Can contain letters, numbers, and other characters.
    • Must not start with a number or punctuation character.
    • Must not start with xml, XML, or Xml.
    • Cannot contain spaces.
    • Should be descriptive of the information they contain.
    • Avoid dashes, colons, and periods (allowed, but colons in particular are reserved for namespace prefixes).
    • Avoid non-English letters/characters (allowed, but may not always be supported).
  • XML Prolog & Processing Instructions
    • Prolog specifies the version and the character encoding used for the XML instance and should always come first in every document.
    • Processing instructions are used to associate presentation and/or transformation files with the data.
  • XML Comments
    • Start with "<!--" and end with "-->"
    • Can include line breaks (a minimal example instance tying these notes together follows this list).
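
Tying these notes together, here is a minimal, well-formed instance showing a prolog, a processing instruction, a comment, a root element with parent and child elements, and an attribute. It is illustrative only – the element names and the referenced stylesheet are made up and are not NIEM-conformant:

    <?xml version="1.0" encoding="UTF-8"?>
    <?xml-stylesheet type="text/xsl" href="person.xsl"?>
    <!-- A comment: this instance is for illustration only -->
    <PersonReport>
      <Person id="p1">
        <Name>Jane Doe</Name>
        <BirthDate>1980-05-17</BirthDate>
      </Person>
    </PersonReport>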

 

Note: Information is being shared under the Creative Commons Attribution-ShareAlike 2.0 (CC BY-SA) license. Original content was created by NIEM course instructors Jenness, Owen, and Carlson.

National Information Exchange Model (NIEM) Practical Implementer’s Course Notes – Anatomy of an XML Exchange (Lesson 1)

NIEM Practical Implementer’s Course
Lesson 1 – Anatomy of an XML Exchange

Core Definitions

  • XML: eXtensible Markup Language, used to define and serialize data as well as to define schemas, transformation rules, web services, and visual presentation.
  • Message: One or more XML documents containing the data to be shared.
  • Publisher: An entity / software program that initiates a “One Way” exchange.
  • Subscriber: An entity / software program that receives messages in a “One Way” exchange.
  • Requestor: An entity / software program that initiates a “Two Way” exchange.
  • Responder: An entity / software program that receives “Request Messages” and returns “Response Messages” in a “Two Way” exchange.
  • Web Service: A type of program that allows a remote system (client) to interact with a program on a local system (server) using XML messages.
  • XML Document (.xml): A file that contains actual data and conforms to the rules of XML syntax (also known as an “Instance Document”).
  • XML Schema Document (.xsd): A set of rules to which an XML document must conform in order to be considered “valid” (see the illustrative instance/schema pair after this list).
  • Web Service Description Language (.wsdl): Pronounced “wiz‐dull”, a document (containing XML) that describes the functionality of a Web Service (like a “Service Contract”).
  • XML Stylesheet (.xsl): An XML document that describes how XML data should be visually rendered.
  • XML Stylesheet Transformation (.xslt): An XML document that defines the rules by which a file defined by one schema is transformed (mapped) to a file defined by another schema.
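
As a purely illustrative (non-NIEM) sketch of how an XML document relates to an XML schema document, consider a tiny instance and the schema it must conform to in order to be considered “valid”. The file names and element names are invented:

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- PersonMessage.xml: the instance document containing actual data -->
    <PersonMessage>
      <PersonName>Jane Doe</PersonName>
    </PersonMessage>

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- PersonMessage.xsd: the rules the instance must conform to -->
    <xs:schema xmlns:xs="http://www.w3.org/2001/XMLSchema">
      <xs:element name="PersonMessage">
        <xs:complexType>
          <xs:sequence>
            <xs:element name="PersonName" type="xs:string"/>
          </xs:sequence>
        </xs:complexType>
      </xs:element>
    </xs:schema>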

General Notes

  • One-Way (Two-Party) Exchange Pattern (Publish/Subscribe)
    • Messages are pushed by a publisher directly to one or more subscribers
    • Messages can be transactional or batch
    • Messages are transport neutral (web service, FTP, email, etc.)
    • Messages are essentially “fired and forgotten”
    • Pattern is very scalable as publisher is insulated from diverse subscriber interfaces
  • Two-Way Exchange Pattern (Request/Response)
    • Requestor sends an XML message requesting specific information
    • Responder replies with an XML message containing the requested information (an illustrative request/response pair appears after this list)
    • Typically implemented via web services
    • Response is typically synchronous (occurs at about the same time)
  • Federated Query
    • Single request message may yield numerous response messages
    • Not all respondents may have data for every request
    • Typically built using a “Message Broker” device, insulating the requestor from the individual responders
    • Message Broker aggregates the multiple responses and returns them to the requestor
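
To make the two-way (request/response) pattern more concrete, here is a hypothetical pair of messages – the request sent by a requestor and the response returned by a responder. The element names are invented for illustration and are not NIEM-conformant:

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- Hypothetical request message from the requestor -->
    <PersonRequest>
      <PersonID>12345</PersonID>
    </PersonRequest>

    <?xml version="1.0" encoding="UTF-8"?>
    <!-- Hypothetical response message returned by the responder -->
    <PersonResponse>
      <PersonID>12345</PersonID>
      <PersonName>Jane Doe</PersonName>
      <LicenseStatus>Valid</LicenseStatus>
    </PersonResponse>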

 

Note: Information is being shared under the Creative Commons Attribution-ShareAlike 2.0 (CC BY-SA) license. Original content was created by NIEM course instructors Jenness, Owen, and Carlson.

How Fast (Or Slow) Is The Speed Of Light?

A Little Background

The first recorded discussion of the speed of light dates to around 300 B.C., when Aristotle quoted Empedocles as theorizing that the light from the sun must take some time to reach the Earth. Almost two millennia later, during the Scientific Revolution (circa 1620 A.D.), Descartes theorized that light traveled instantaneously. At about the same time, Galileo offered the more general thought that light was much faster than sound but not instantaneous, suggesting some ideas as to how this might be tested using lanterns and telescopes. At what point would these theories actually be tested, and how?

About half a century after Descartes and Galileo, the Danish astronomer Ole Römer began measuring the actual speed of light through observation of Io, one of Jupiter’s moons. He recognized that as the Earth and Jupiter moved in their orbits, the distance between them varied. The light from Io (reflected sunlight) took time to reach the Earth, and took the longest when the Earth was furthest from Jupiter, since there was extra distance for the light to travel. Accordingly, the observed eclipses of Io fell furthest behind the predicted times when the Earth was furthest from Jupiter. By measuring the difference in time and using a little math, the speed of light could essentially be calculated.

From that point forward, numerous scientists tackled this quest through a diverse set of accompanying theories and experiments. The speed of light would be more accurately determined, leading to wide applications in optics, astronomy, and physics. For example, in the early 1900s, the speed of light became a foundational component of Einstein’s theories of special and general relativity, relating energy to mass (E = m*c^2, where c is the speed of light). As a result of these applications, the calculation of the speed of light became a major platform for new scientific discovery and enlightenment.

So How Fast Is It?

Well, the speed of light in a vacuum is exactly 299,792,458 meters per second (the meter is now defined in terms of this value), often approximated as 300,000 kilometers per second (3.0 * 10^8 m/s, or 3.0 * 10^5 km/s) or 186,000 miles per second. Outside of a vacuum, where atoms and molecules act as impeding forces, light slows down based on the refractive index of the material. For a given substance with refractive index n, the actual speed of light v is given by v = c/n, where c is the constant speed of light in a vacuum. Of note:

v(air) = 299,704,764 m/s (n=1.0002926 at standard room temperature)
v(water) = 224,900,568 m/s (n=1.3330)
v(salt) = 194,166,100 m/s (n=1.544)
v(diamond) = 123,932,392 m/s (n=2.419)
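
As a quick worked example, plugging the water value into v = c/n:

    v_{\text{water}} = \frac{c}{n_{\text{water}}} = \frac{299{,}792{,}458~\text{m/s}}{1.3330} \approx 224{,}900{,}568~\text{m/s}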

Let’s put the speed of light, in air, in a bit of context…

The circumference of the Earth is about 40,000 km on average. That means that light could travel around the Earth 7.5 times in a second.

The distance between the Earth and its moon is about 380,000 km on average. It takes light about 1.27 seconds to travel from one to the other.

At the scale of our solar system, it takes light from the sun about 8 minutes to reach Earth, 43 minutes to reach Jupiter, and nearly 7 hours to pass the orbit of Pluto.

At the galactic scale, the Milky Way is a spiral galaxy. Our solar system is located on what is called the Orion Arm, about 25,000 light years from the Milky Way’s center. One light year is the distance light travels in one Earth year. In more earthly terms, that’s about (3*10^5 km/s)*(60 s/min)*(60 min/hr)*(24 hrs/day)*(365 days/yr) = 9,460,800,000,000 kilometers. And I thought a marathon was far.

Beyond our Milky Way galaxy, our Local Group of galaxies extends about 4 million light years across. That means it takes light 4 million years to travel from a galaxy on one side of our Local Group to a galaxy on the other side. Yikes.

And our Local Group of galaxies is part of a larger “supercluster” that is 150 million light years across. The dinosaurs roamed Earth from 230 million to 65 million years ago. In other words, light emitted from the far side of that supercluster around the time the dinosaurs went extinct still hasn’t reached us. Makes light seem pretty slow now, no?

Whether quick relative to earthly distances or slow through vast cosmic voids, even light has meaning. It provides perspective, foundation, discovery, and well, light.

For more on the speed of light and the depths of the universe and time, I highly recommend Bill Bryson’s A Short History of Nearly Everything (it’s my favorite book).

Technology And Intelligence In The Next Decade

The below is an essay I wrote for my Technology and Intelligence class in early 2008 (STIA-432 at Georgetown University). It is meant to describe a few of the current problems faced and the nature of those problems, but not to offer up solutions. In the past year we have certainly seen the continuation of existing challenges coupled with the emergence of new ones. Today’s scientific and technological paradigm is by no means a simple one. But I do believe that with the collaboration of bright minds and the continued objective to ride and guide the progressive technological waves of the 21st century, substantial risks will be mitigated.

If History Could Tell

Since the establishment of the Office of Strategic Services in 1942 and subsequently the Central Intelligence Agency in 1947 (via the National Security Act), a core mission has been the collection and analysis of strategic, actionable information. This process has always required technology in the form of communications equipment, navigational tools, security systems, listening devices, and much more. Historically, the Intelligence Community as a whole has been way ahead of the technological curve, and in most cases, has established and controlled the curve. With information security and access to federal funds, various agencies have been given the ability to turn novel ideas into useful instruments for collection, analysis, and dissemination. However, history has become the past, and no longer dictates the way in which the world of technological development can move forward. Federal and international regulations, advancements in information theory, collaborative networks, and the global information age via the internet have all contributed to rapid, worldwide technological development that is no longer behind the IC on the tech curve. In the next decade, the Intelligence Community has the potential to fall even with or behind the global pace of technological development, and as a result will find new struggles in all sources of intelligence, whether clandestine or not. Some arguments state that the IC, with some elements of special authority granted to preserve national security interests, will flourish as a developing technical lab for operations. However, the best and the brightest technical and analytical minds are not necessarily organized within the IC anymore, but rather are connected without boundary via the internet. Open-source development and the speed at which the commercial world can access capital may eventually move the IC technical approach to the back of the line.

The Whole is Greater Than the Sum of the Parts

Collaborative technologies have particularly flourished in the past five years. Social networking sites such as MySpace, Facebook, and Flickr, knowledge management platforms such as Microsoft SharePoint and TheBrain Technologies, and the entire blogosphere have accelerated communications without any distance barriers to get around. Information is passed, shared, and edited with the click of a button. SourceForge, an online network for open-source software development, has brought a vast array of new technologies to a market that never before existed. This lack of predictability for the technological market puts the IC in “catch-up” mode. Wikipedia, as well as other information warehouses, accelerates knowledge consumption for the individual – not just a business or state entity. With a horizontal, access-free, organizational structure, these applications have few barriers. Although the IC works to chase these technologies with A-Space and Intellipedia, an accompanying hierarchical structure and tiered-access system could truly dampen collaboration on a technological front.

Getting Small Could Lead To…

As the world grows in size and energy, the capability to pack information, data, and logic into smaller and smaller units continues to develop. Through nanotechnology and quantum computing, academic research groups as well as large corporations have minimized size requirements and increased processing speed in the same products. The associated power that now exists in these products outside of the Intelligence Community weakens the IC’s ongoing ability to leverage such products for foreign surveillance tactics with communications, imagery, measurements, and signals collection.

…A Much Bigger Problem

In the next decade, the IC and the United States as a whole will face incredible security and technological challenges. Tension will increase as national policy must find a balance between civil liberties and national security interests. With recent information warfare events, such as hacks into Pentagon computers, developmental advantages can change in an instant. International policies will also affect development within the U.S. government and could unfortunately give an edge to non-governmental organizations that can more easily practice CBRN weapons testing (with high-tech delivery instruments), removed from many international regulations. Unfortunately, if the Intelligence Community drifts toward a more reactionary state, the technological and security risks become increasingly serious.

A (New) Final Thought

It’s just as important to anticipate the wave as it is to ride and guide the wave. Surfers find waves through reaction AND proaction. The same goes for collection, analysis, and technological development. There is more historical and real-time data than ever before. Deterministic and probabilistic models are more advanced than ever before. We can do something with all this data to find patterns and indications of technological risk. At the same time, we have more intellectual and psychological understanding of cultures around the world, and of the associated mechanisms of travel, prayer, consumption, loyalty, and desire, than ever before. Pairing one with the other gives us the connect-the-dot power that can truly shape our understanding and awareness of the world and the technological risks that threaten our security and sustainability as people.

The Intersection Of Expertise

As I begin my job search (25 applications in 2 days so far!), I keep asking myself how to describe what I’m looking for in a job and in what realm I wish to work. There is no specific job title that describes my experience and education (e.g. “doctor” or “software engineer”), and there is no one department in which I’ve worked or wish to work (e.g. “Operations” or “Logistics”). Yes, I have an academic background in mathematics & statistics, yet it’s difficult to communicate why I have that academic background. I do not necessarily want to become a statistician; rather, I fully understand the quantitative nature of things and the power that numbers, math, and quantitative methods have in all aspects of business, government, and life.

So where does this leave me? Well, unemployed and confused, for one. But that’s okay with me. I’m confident that with my capabilities, no matter how hard they may be to communicate in an application or even to a recruiter, I’ll find the position that leverages my abilities and motivation.

That being said, I think I’m at least getting close to describing where I stand, and in real-world terms. It’s at an intersection of sorts – between quantitative methods, scientific and technological realms, and the human element. It’s interdisciplinary – it can fit within any group or team, or stand alone in the work of an independent researcher or consultant. It’s also dynamic – it parallels the speed with which modern business operates and the flexibility required to optimally support the needs and requirements of many types of personnel.

I’ve used a similar image a few times, in posts on knowledge innovation and math in 2010 and beyond. Here I’ve intersected three main topics while including some of my strengths in the middle. Now if I could only match those to a job title…

At what intersection do you operate?