

Book Review-Anthro-Vision: A New Way to See in Business and Life

It sounds vaguely like a superpower: "Anthro-Vision – the power to see things the way they really are." However, that's exactly what Gillian Tett is proposing for us everyday non-superheroes. Anthro-Vision: A New Way to See in Business and Life is the application of anthropological ideas and approaches to help us see the things in life that we often overlook and, perhaps more importantly, to develop greater understanding and empathy for others. Unlike the superhero version, it takes work – and, also unlike the superhero version, it's real, and it's really powerful.

Anthropology

What does the study of distant cultures and societies have to teach us today?  As I explained in my review of The Ethnographic Interview, the same skills can be applied to those who seem strange to us but who live next door.  The intellectual framework of anthropology tells you to question your assumptions and to be just a bit skeptical of what you do – and do not – know.  This framework helps us see what others are missing – what they’re blind to.

It also helps us avoid the kinds of problems that Lauriston Sharp found with the introduction of steel axe heads to Stone-Age Australians.  Published in 1952, the article “Steel Axes for Stone-Age Australians” explains that the material advance caused a breakdown of society.  Had someone realized the social relationship between the elders who owned the stone axe heads and the young men and women who borrowed them, needless suffering could have been avoided.  (See The Diffusion of Innovations for more.)

How many times do we introduce something to the world with no idea how the world will respond or even how people might approach it? Sherry Turkle in Alone Together offers her concerns that a world connected by technology can leave us feeling less connected. It's an anthropological framework that helps us understand – and thereby minimize – these tragedies.

I see echoes and reflections of anthropology in other disciplines. Design thinking (See Design Thinking), human-centered design (See The Art of Innovation), and user experience (See The Elements of User Experience) are all related approaches that seek to eliminate preconceived notions and start with a deep understanding of those being served. The work of visionaries like Gary Klein, particularly in Seeing What Others Don't, exposes us to the back half of the anthropological approach – that is, the sense-making process. Even Dave Snowden's Cynefin decision framework relies upon the ability to see the situation clearly before designing solutions. Horst Rittel's wicked problems came out of the study of urban planning – the same kinds of cultural problems that anthropology seeks to understand in what often feels like a chaotic world. (See Dialogue Mapping for more on Rittel's work.)

Liminality

Tett quotes Victor Turner, who uses the concept of "liminality," drawing on the idea that most cultures employ rituals and symbols to mark transition points. I'd extend this further to Arnold van Gennep's work in Rites of Passage and how we signal our transitions between stages of life. While Erik Erikson is best known for his work charting the development of children, particularly in Childhood and Society, he wasn't focused on the rituals (rites of passage) that we use, because his focus was more on inward development than on outward signs.

However, it would be an error to ignore that the ways we recognize and celebrate rites of passage are important to our psychological development.

In fact, as Daniel Pink explains in When, we don’t share a single sense of timing.  The sense of timing we do have is often driven by external landmarks or external symbols by which we calibrate our sense of time – but we don’t all use the same landmarks and symbols.

A Lesson from the Jesuits

In a world where religion is often filled with immutable rules, rituals, and customs, the Jesuits seem to have found a way to separate the essential parts of their beliefs from the non-essential.  According to Chris Lowney in Heroic Leadership, they were able to adapt to their environments by demonstrating the principles they believed in without getting caught up in the customs that weren’t connected to their core beliefs.  When you needed to dress differently to fit in and be accepted, that’s what you did.

I’m reminded of this because the Jesuits had to question everything about their faith and strip all the unnecessary assumptions.  They had to see ways that their habits were getting in the way of their mission to bring the message of Jesus to everyone on the planet.

Sidestepping the question of religion and simply evaluating their behaviors and outcomes, their results are impressive. They founded universities, and they lived in harmony with people of different beliefs. They had to be – in essence – anthropologists. They couldn't judge others by their strange customs but instead had to learn them and, if they didn't conflict with their core beliefs, accept them.

Finding the Hidden Assumptions

It’s a cold day, and you decide to go for a walk in the local indoor mall.  Which side of the hall do you walk on?  Most of the world answers on the right.  They instinctively do it.  They wouldn’t have thought about it had you not asked.  However, those in the United Kingdom and New Zealand, who drive on the left side of the road, instinctively try to walk on the left side of the mall’s enormous halls.  There’s no rule for this, but it just happens.  It’s a hidden assumption.

You’re a guest at a home in the Midwest of the United States, and you clean your plate.  Your host is pleased, because it signals to them that you were happy with your meal.  The next week, you’re in an Asian country in another host’s home.  You clean your plate, and the host seems subtly frustrated, continuing to ask you if you want more food.  You politely decline.  Your host is offended – and you don’t know why.  What you don’t know is that it’s customary to leave a small amount of food on your plate to signal to your host that you’ve had enough food.  In one context, you’re being a polite guest; in the other, you’re being rude – without even knowing it.

The point of Anthro-Vision is to be able to see these cultural differences, assumptions, and hidden meanings that exist in ways that we don’t realize.

Discussing the Undiscussable

There are some things you just don’t discuss – in polite company.  Depending upon where you were raised, it might be sex.  It might be money.  It might be bodily functions.  The familial, organizational, or social norms may make it rude to discuss such topics.

However, as an outsider, it's possible to ask the questions that others can't ask. You're given special grace, because everyone is clear that you don't know the rules. Sometimes that's exactly the break it takes to make things more discussable. Another factor that opens doors: once others realize you're talking with people in their "in" group about topics they can't raise themselves, they'll invite you to talk with them, too. That creates pull toward getting the interviews you want. (See Dialogue – Defensive Routines, and Organizational Traps for more about undiscussable topics.)

Espoused Beliefs and Beliefs in Practice

It doesn't make sense. What people say about the way their lives work doesn't match the outcomes. They say they believe in using less water, but they endlessly water their lawns. Measured water usage is high, yet people respond to surveys saying they're very focused on water conservation. The real reason may be that they don't understand how their behaviors impact the situation – or it may be that they have a competing belief. There may be something that's even more important to them than water conservation.

Immunity to Change encourages us to look for these places of inconsistency and ask why.  Steven Reiss in Who Am I? and The Normal Personality encourages us to look where motivators are in conflict and figure out which motivator is holding back the behavior that we want – and that the person espouses.  Anthropologists simply watch what people actually do and then ask them why.  It’s a pragmatic approach that helps to resolve the discrepancy between the quantitative results and the beliefs that we have about how people will behave based on what they’ve told us.  (You may find The Largest Gap in the World – Between Saying and Doing useful when you can’t reconcile what people say and what they do.)

All Problems are Human Problems

William Isaacs in Dialogue describes four pathologies of thought, the first of which is abstraction.  Tett explains that the financiers at the heart of the financial system meltdown didn’t speak in stories of people or include pictures of them in their presentations.  They had abstracted themselves from the individuals who were taking out loans and trying to pay them back.

Anthropology acknowledges that all problems are human problems.  If you’re not sure how you’re connecting with and improving the lives of real people, then something may be wrong.  If your vision doesn’t involve the people, then something is missing.

We’re Not Safe

There's this nostalgic view of the past where crime didn't exist, and no one locked their doors. The perception today is that we must protect our children from murder, violent crimes, abductions, and more. However, the problem with both of these perceptions is that they're simply not true. The statistics compiled by LetGrow from the FBI, Pew Research, the Bureau of Justice Statistics, and others don't support this perception. The murder rate isn't materially different than it was in 1960 – and it's roughly half of what it was in 1990. In short, the real rate hasn't changed. Quantitatively, there's no net change.

However, the qualitative view is that parents are substantially more afraid of violence against their children than their parents were.  The belief is that the world is more dangerous, and therefore children can’t be allowed to go out like they used to.  The result is that their children are escaping into cyberspace, where they’re free to meet others and hang out.  America’s Generations describes how millennials were helicopter parented because of parental concerns about their safety – irrespective of reality.

What did change about violent crimes was our awareness of them. The rise of television, news, the internet, and pictures of missing children on the walls of Walmart left us believing that the prevalence of the crime was much greater, because we were hearing about it more. The result was overprotected children. The Coddling of the American Mind explains the outcomes that we get with children who were coddled and overly protected.

Treat Life Like Fieldwork

My favorite part of the book is where Tett explains that an ex-boyfriend complained she was a terrible person to go on vacation with, because she treated it like fieldwork. It's my favorite, because it gives a rallying cry for all of us: "I treat life like fieldwork." I can't think of a better way to live life – always curious, always learning, always questioning. Maybe that's what has allowed her to develop her Anthro-Vision – and what can lead you to develop yours.

Book Review-The Long Interview: Qualitative Research Methods

It’s easy to get wrapped up in big data, AI, and quantitative approaches to research and forget that there’s another dimension to research that is just as important as – if not more important than – the numbers that we seem to be driven by.  The Long Interview (part of the Qualitative Research Methods Series) is one approach to qualitative research that can provide a semi-structured approach leading to important answers.

Qualitative Methods

Quantitative methods use large numbers and apply calculations and statistical formulas to produce results that can quantify what is happening – but they're unable to explain why. Qualitative research methods flank the quantitative methods, being used both before quantifying metrics, to define what to capture, and after quantitative results are seen, when there's a need to explain them.

In the pre-quantitative mode, qualitative methods like ethnographic interviews (see The Ethnographic Interview), participant observations, focus groups, or in-depth interviews – and to some extent The Long Interview – can create an understanding of a topic area, which can be used to structure questions and the collection of data that should be meaningful. In these cases, qualitative methods ensure that what is captured and analyzed is relevant – it matters – and accurate – it's free of unnecessary ambiguity.

Conversely, the same methods can be focused on a specific result or situation for the purposes of exposing why the results are what they are.  Can’t explain why you can’t convert leads?  Qualitative approaches can help you figure out why.  You’re seeing a spike in sales in one region and want to replicate it?  Qualitative approaches can help you identify why sales are spiking to see if it can be replicated.

Where quantitative methods involve rows and columns of numbers, qualitative approaches ultimately center around conversations, many of which may be one-on-one.  Ultimately, it’s a short-term relationship between an interviewer and a respondent.

Waterfall and Iterative

There has been an ongoing discussion in the software development profession for three decades now – not quite half the lifetime of the profession. Should software be designed in one big pass, like bridges are built, or should it be built bit-by-bit over time, like a pearl? The iterative, like-a-pearl approach is generally perceived to be slower and more expensive, but in real life, we find that it often works better – in some circumstances.

The critical difference between bridges and software was – and often still is – that the mechanical characteristics of the materials of a bridge are well known.  There are many previous bridge building plans that can be reused or at least adapted to the purpose of building a new bridge.  In short, the details don’t change, and they’re well known in advance.  Software rarely – but occasionally – fits this definition.  There are some projects that can be built in one fell swoop – and should be.

At the other end of the spectrum are projects where the technology is unproven and the user expectations aren't set, so anything can happen. In those cases, the degree of uncertainty justifies a slower, more iterative approach. The point of the iterative approach is to create more learning about the situation so that the investments aren't so large.

Qualitative research is that kind of iterative, we-don't-know-the-territory investigation. It's what we do when we don't know how we're going to get to the answers. Driving down a well-known road with quantitative research and big data is certainly faster – but someone has to know where people need the road to go and build it. It's important to realize that research isn't an "either-or" proposition but an "and" proposition, where qualitative makes quantitative more effective.

Respondents

In the context of social or anthropological research, there’s a challenge that respondents “lead hectic, deeply segmented, and privacy-centered lives.”  That makes it difficult for them to dedicate time to the qualitative interview process – and less likely to share when they’re in the interview.  It’s not easy to get people to reveal deep insights into their world because they don’t see them – and if they did, you’d need to build sufficient trust quickly to earn the right to hear their story.

Therefore, techniques like those shared in The Ethnographic Interview, lessons from Motivational Interviewing, and developing a deeper level of trust are essential for qualitative research success. (See Trust=>Vulnerability=>Intimacy, Revisited for more on developing trust.) While qualitative research doesn't require as many people, it does require the rapid development of a deeper understanding of those people.

Qualitative research takes time and can be perceived to be expensive, but it’s an important component to ultimately understanding the overall picture.

Nine Issues

McCracken outlines nine issues that relate to qualitative research:

  1. The Social Scientific Research Community – How should qualitative research fit in with other methods?
  2. The Donor Social Science – How do we bring the various qualitative research disciplines into a coherent conversation?
  3. The Qualitative / Quantitative Difference – As we described above, qualitative and quantitative work together – they are not competitors.
  4. Investigator as Instrument – Quantitative research can be scaled because it doesn’t require a human. Qualitative research requires and is influenced by the investigators – for better or worse.
  5. The Obtrusive / Unobtrusive Balance – There's a need to push the respondent – and there's a need for the investigator to step back and listen. Finding the balance isn't always easy.
  6. Manufacturing Distance – McCracken uses distance as a term for detachment and actively minimizing assumptions. (See The HeartMath Solution for more on detachment.)
  7. The Questionnaire – The point of the questionnaire in the long interview is to provide some structure to the conversations.
  8. The Investigator / Respondent Relationship – Here, McCracken is focused on the level of formality / informality in the relationship between the investigator and respondent.
  9. Multimethod Approaches – Here, McCracken is illuminating the need for multiple approaches.

Rummaging

The messy bit about qualitative research is that it draws from the investigator's personality, experiences, beliefs, and skills. There's no one direct path to the answer. Marcia Bates describes how we come to know things in terms of active versus passive and directed versus undirected discovery. She estimates we get 80% of our knowledge in a passive, undirected way. In short, most of what we know we didn't seek to know. When we're working with respondents, their experience of their lives isn't directed, and therefore we shouldn't expect a single, straightforward set of questions to lead to a clear understanding.

Knowledge Management

Knowledge management concerns itself with explicit – contextless – information and the kinds of implicit or tacit knowledge that are hard to describe. What's wonderful about qualitative approaches is that they are a process through which some implicit information becomes explicit. The investigator is the process through which the information is converted. The investigator uses the long interview and other qualitative techniques to make sense of what they're hearing from the respondent and to elicit those things that the respondent rarely thinks about directly.

Four Steps

The Long Interview lays out four steps:

  1. Review of Analytic Categories – This is the orientation phase where the investigator begins to understand the overall landscape and the way that things appear – on the outside – to relate.
  2. Review of Cultural Categories – This is preparation for how to ask the questions. It includes the development of a questionnaire and any materials that may be necessary to support the interview.
  3. Discovery of Cultural Categories – This is the interview.
  4. Discovery of Analytic Categories – This is the post-interview analysis.

The Analysis

The analysis is itself broken into stages:

  1. Transcript / Utterance
  2. Observation
  3. Expanded Observation
  4. [Connected] Observation
  5. Theme
  6. Interview Thesis

In Sum

The process laid out provides structure to what might normally be addressed as an ethnographic interview.  While it’s more efficient than a less structured process (or can be), McCracken still cautions that you should not use qualitative methods unless you cannot use quantitative ones.  I’d soften that a bit to say you should only use qualitative approaches to the extent that they’re required.  The irony is that the great benefit of The Long Interview is to avoid the longer interview.

Streamlining Search

There was a shift that happened over a decade ago: the shift from having no good options for full-text search inside an organization to having a few good options for searching inside documents. In a moment, people felt as if they could abandon the difficult process of entering metadata and instead just rely on full-text search to help them find whatever they wanted.

It worked when it was tested, but in practice the results generally contained too many files. It was as difficult to sort through them as it was to navigate the old structure or enter the metadata to make the documents findable. In short, it worked in the small scale, but not in the large scale, so the pendulum swung back towards a strategy that included traditional findability and browsability approaches augmented with full text search.

Today, we have tools at our disposal that can help us optimize the ways that we store and retrieve information to make it easier in both directions.

Database Queries

In the distant past (a few decades ago), the way that you located documents inside of a content management system was to search field by field. If you didn't have the locator number, you had no chance of finding the information you wanted. As a result, investments were made to ensure that the metadata was entered – and entered correctly. Quality control, double entry, and verification was the name of the game, and it was a big game.

The problem was that the process of getting the right metadata in was expensive for anything except for operational records. Operational records could be output from one machine and stored in the content management system without human intervention or error. The fields that were indexed were consistently provided, because they were done by the interface as it dropped files into the system.
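To make the field-by-field model concrete, here's a minimal sketch in Python – with hypothetical records and field names – of the kind of exact-match lookup these systems offered. Without the exact locator value, nothing comes back.

```python
# Minimal sketch of exact-match, field-by-field retrieval.
# The documents, field names, and values are hypothetical.
documents = [
    {"doc_id": 1, "invoice_number": "INV-1001", "vendor": "Acme, Inc."},
    {"doc_id": 2, "invoice_number": "INV-1002", "vendor": "Globex"},
]

def find_by_field(docs, field, value):
    """Return only documents whose metadata field exactly matches the value."""
    return [d for d in docs if d.get(field) == value]

print(find_by_field(documents, "invoice_number", "INV-1002"))  # one hit
print(find_by_field(documents, "invoice_number", "1002"))      # nothing: no exact match
```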

Enter Full Text

A boon for finding documents that weren't emitted by a system, the introduction of full text promised that no one would have to enter the invoice number – the system would use optical character recognition (OCR), find it, and instantly display the document. The limitations of OCR – roughly 90-percent accuracy – and the commonality of the numbers quickly turned a search for a single document into hundreds of results, as users struggled to articulate that it was the invoice number they wanted, not the purchase order number, the requisition number, or any of the thousand other serial numbers that occur in an organization.

Some organizations had already implemented prefixes to simplify identification and disambiguation, but many had not; they started doing data entry again, and users started searching specific fields. The good news for full text search engines is that they would accept a search for either the full text or the metadata and would prioritize the metadata in the results. So, while there would still be hundreds of results, the invoice you were looking for was on top.
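A sketch of that prioritization – metadata matches ranked ahead of plain full-text matches – might look like the following. The weights and field names are assumptions for illustration, not any particular product's ranking algorithm.

```python
# Hypothetical scoring: a hit in a metadata field outranks a hit in the body text.
documents = [
    {"doc_id": 1, "invoice_number": "45017", "body": "purchase order 45017 ..."},
    {"doc_id": 2, "invoice_number": "88231", "body": "referencing invoice 45017 ..."},
]

METADATA_WEIGHT = 10  # assumed weights, for illustration only
FULLTEXT_WEIGHT = 1

def score(doc, term):
    s = 0
    if term == doc["invoice_number"]:  # exact match on the metadata field
        s += METADATA_WEIGHT
    if term in doc["body"]:            # match anywhere in the full text
        s += FULLTEXT_WEIGHT
    return s

term = "45017"
ranked = sorted(documents, key=lambda d: score(d, term), reverse=True)
# The document whose invoice_number field matches floats to the top,
# even though both documents mention the term somewhere in their text.
print([d["doc_id"] for d in ranked])
```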

Real Refiners

Soon after, we gained the capacity to take those metadata fields and use them to refine searches. You could search for the ZIP code 01234, and the results would appear with refiners that could be used to filter them. If there were results from five customers, those five customers would appear in the refiners pane and – importantly – the other 30,000 customers' names wouldn't be there. This filters out extraneous noise and lets users pick how they want to narrow the results from a small list that's easy for them to process.

While clicks are generally bad things, the added value of a focused list was worth the clicks and the wait for the revised search to complete. In short, it was quicker and easier than the user trying to sort out the right result for themselves. After a few refinements, the results list was small or the item that you really wanted on the top.
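The mechanics behind refiners are simple to sketch: count the values that actually appear in the current result set, and let the user narrow by one of them. A minimal Python illustration, with hypothetical results and field names:

```python
from collections import Counter

# Hypothetical search results already narrowed to ZIP code 01234.
results = [
    {"doc_id": 1, "customer": "Acme, Inc.", "doc_type": "Invoice"},
    {"doc_id": 2, "customer": "Acme, Inc.", "doc_type": "Contract"},
    {"doc_id": 3, "customer": "Globex", "doc_type": "Invoice"},
]

def refiners(docs, field):
    """Build a refiner: only values present in the current results, with counts."""
    return Counter(d[field] for d in docs)

def refine(docs, field, value):
    """Apply a refiner selection to narrow the result set."""
    return [d for d in docs if d[field] == value]

print(refiners(results, "customer"))           # Counter({'Acme, Inc.': 2, 'Globex': 1})
print(refine(results, "customer", "Globex"))   # only the Globex results remain
```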

Unnecessary Clicks

While the clicks for refiners are valuable to the person looking for a document, clicks that are necessary to get them from their result to what they really want aren’t. In some cases, the search result would simply lead to a container that had what the person was looking for. This is the case particularly often when the information has a listing or summary on the page that was returned but also occurs when the folder or container has metadata on it that the items themselves don’t have.

Soon, users wonder why they must do all the extra clicks at the end of the search. It feels wasteful and frustrating.

Pogo Sticking

The extra-clicks problem is particularly frustrating when the person doesn't know exactly which document they need and is forced to do a few clicks per result to evaluate whether it's what they wanted. This pogo-sticking problem is one of the reasons why modern search results include previews of the document right in the search results window (usually as a popup), so that users can quickly hover over a result and, ideally, discover whether it's what they want or not.

Extra clicks between the results and the desired documents break this functionality and make it harder for users to get the results they need.

Slick Search

A slick search is aware of the context in which you issued the search and the social network around you to identify what things you’re most likely to be looking for, and it makes suggestions for common mistakes. However, more importantly, the search leverages the metadata from containers and pushes the appropriate metadata to the documents, so that search results are the actual documents the users are looking for rather than containers.

Getting to a slick, streamlined search is about making it easy for people to get the right result. That means providing easy ways to reduce the results to a manageable number and evaluate individual results quickly.
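One way to picture the "push metadata from containers down to documents" idea is a simple inheritance step at indexing time. This is a sketch with hypothetical folders, documents, and field names, not any specific product's behavior.

```python
# Hypothetical folders (containers) carrying metadata their documents lack.
folders = {
    "/clients/acme": {"client": "Acme, Inc.", "region": "Midwest"},
}

documents = [
    {"path": "/clients/acme/statement-of-work.docx", "title": "Statement of Work"},
    {"path": "/clients/acme/invoice-1001.pdf", "title": "Invoice 1001"},
]

def inherit_metadata(doc, folders):
    """At indexing time, copy the parent folder's metadata onto the document
    so search can return the document itself instead of its container."""
    enriched = dict(doc)
    for folder_path, metadata in folders.items():
        if doc["path"].startswith(folder_path + "/"):
            for key, value in metadata.items():
                enriched.setdefault(key, value)  # the document's own values win
    return enriched

index = [inherit_metadata(d, folders) for d in documents]
print(index[0])  # now carries client and region inherited from its folder
```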

When Browsability is Broken

It’s familiar and convenient to browse for files – until it isn’t. Our human brain has limits; even with the familiarity of having created the files ourselves, our ability to remember, recognize, and retrieve the right file at the right time is limited. So how do we know when we should stop trying to browse for files and should lean on search? Perhaps more importantly, how are our small decisions pushing us towards a browsability approach when we need to be more focused on how search can help us find the documents we need?

The Magic Number 7 +/- 2

It was 1956, and George Miller at Harvard had done some research about our working memory. He concluded that we have the ability to maintain about seven items in our heads. More specifically, he identified seven plus or minus two. What this means for the way we process lists is that we're capable of processing about seven items in a list. Later replication of Miller's findings in European languages seemed to suggest a slightly lower number (five), which was later attributed to the fact that we have about a two-second audio buffer in our heads, and European languages take slightly longer to convey the same information. This is close to Gary Klein's later work as documented in Sources of Power, where he explains that we can simulate a system with three factors and six states.

The short version is that the human brain seems effective at processing small lists but is inefficient at processing large lists. In fact, what appears to happen is that when we encounter a large list, we start to scan subsets of the list to look for the items that we’re interested in.

Hick's Law

As the world wide web was growing, there was a great deal of interest in Hick’s Law and the need for everything to be accessible within three clicks. Hick’s Law says, in brief, that in well-ordered lists, one large list is more efficient than two smaller lists. Many people used this to justify very long lists of menu items and countless links on navigation pages.
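As a rough sketch of the reasoning, using the common logarithmic form of Hick's Law (where b is an empirically fitted constant and n is the number of equally likely, well-ordered choices), decision time grows only logarithmically with the size of the list, so one pass through a big list beats two passes through half-sized lists:

```latex
% Hick's Law, common logarithmic form
T = b \log_2(n + 1)

% One well-ordered list of n items versus two sequential lists of n/2 items:
b \log_2(n + 1) < 2\,b \log_2\!\left(\frac{n}{2} + 1\right) \quad \text{for all } n > 0

% Example: n = 64 gives roughly 6.0b versus 10.1b,
% so the single large (but well-ordered) list wins.
```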

However, the important caveat to Hick’s Law is that the listing must be well-ordered. The argument has been made that an alphabetical listing is a well-ordered list, because it follows a very predictable pattern. In that sense, it’s true; but there’s a larger problem, and that’s the problem of synonyms. (See Pervasive Information Architecture for more about Hick’s Law.)

Synonyms

The reality of our mental processing is that we think in concepts and then later apply words to those concepts. Is it a shirt or a blouse? Conceptually it’s a “top” or covering for the top of the body. It’s only when we begin to think about the context we’re using it in that we can find the “right” word for the situation.

The problem with ordered lists is that they must be contextless – rather, it's not possible to perfectly correlate the context of the designer of the list and the consumer of the list. The result is that we need to consider synonyms when searching a list. Shirt, top, and blouse occur in radically different positions in a single list. This invalidates the work that leads to Hick's Law and puts us at the mercy of the anxiety created by the paradox of choice.

The Paradox of Choice

More choices seem like a good thing. However, as Barry Schwartz elegantly explains in The Paradox of Choice, this isn’t always the case. In fact, the research seems to indicate that there is an anxiety created when there are too many choices and no clear directions to move in. Certainly, our goal in creating systems isn’t to create anxiety in our users, but that’s what we do when we create large lists.

So, while Hick’s law implies that we should have larger lists, the paradox of choice pushes back on this assertion with anxiety. Collectively, this effectively invalidates Hick’s argument in the context of unstructured data. As a result, we’re back to finding ways to shrink our lists towards the kinds of numbers that our brains can handle well.

Folders and Directories

In addition to the human factors, there are technical reasons to restrict any given file directory to a few thousand entries. Whether the technology is traditional file systems or content management systems based on relational database engines, more than a few thousand files in a single directory (or query) can be problematic. For performance reasons, it’s best to keep entries at a few thousand entries or fewer – even if the users never directly access the files by browsing.

As a result, in addition to the natural foldering that might be used to separate files with different metadata, there are often time-based foldering strategies that keep individual directories to only a few thousand files.
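A sketch of the kind of time-based foldering strategy described above – hypothetical paths, assuming the file's date is available at filing time:

```python
from datetime import date

def time_based_folder(root, file_date, file_name):
    """Place files into year/month subfolders so no single directory
    accumulates more than a few thousand entries."""
    return f"{root}/{file_date.year:04d}/{file_date.month:02d}/{file_name}"

print(time_based_folder("/invoices", date(2023, 7, 14), "invoice-1001.pdf"))
# /invoices/2023/07/invoice-1001.pdf
```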

Collectively, this makes finding files by browsing harder – even if the human factors limiting the number of items are ignored.

The Browsability Number

There is no one number between the magic number seven that Miller proposed in 1956 and the technical limitations around thousands of files that can be browsed in a single directory. As the number of files increases, the anxiety and frustration increase. The subset scanning strategy that we as humans use tends to break down by about 100 files, so any situation where we can’t reach fewer than 100 files is unlikely to be easily browsed regardless of any file naming conventions or other organizational techniques that may be in use.

The Search Solution

The solution to the problem is to switch from a browsability strategy to a search focused strategy. Browsability-based solutions are focused on single-dimensional naming strategies and large directories. A better strategy is to develop a rich search strategy that leverages metadata and search refinement to create ways for people to leverage technology rather than attempting to manage the identification process internally.

Search refinement allows for initial criteria to be specified and then further refined by selecting metadata in other columns or dimensions. With a well-built taxonomy, the process of searching for documents with refiners is quick, and because of the limited number of options, it is not anxiety-producing. One key to this is the fact that search refiners don’t show every possible value but instead show only those values that exist in the results that are already displayed – this focuses the searcher into only those options that are relevant to the current context.

Browsing works when there are only a few files, but at the scale of thousands, tens or hundreds of thousands, or more, search is the only way to go – and it’s focused on metadata and refiners.

When we work on file naming conventions, we’re necessarily working from the perspective of browsability, even when it may be broken. (See File Naming Conventions Miss the Point for more.)

File Naming Conventions Miss the Point

I use file naming conventions. I recommend them to clients. However, they’re not the point. They’re a means to an end when it comes to managing files. They’re a way to apply metadata to the file name in a repeatable way. However, when they become more important than appropriate focus on developing an information architecture that works, we’ve missed the point.

Metadata-less Systems

Much of the focus on file naming conventions is the result of the fact that most of the systems that people use only allow for one piece of user-entered metadata – and that is the file name. The advice in this scenario is to create a delimited approach to embedding multiple pieces of metadata into the file name so you, as the human, can parse it out.

While the strategies vary widely – and often hinder findability – they fundamentally consist of embedding pieces of metadata into components of the file name. For instance, my invoice file naming convention is YYMMDD-INV#-Client-Project. It's designed to separate out four pieces of metadata in the name of the file: the date of the invoice, the invoice number, the client, and the project.
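As an illustration of how such a convention encodes – and can later recover – metadata, here's a minimal Python sketch that parses the YYMMDD-INV#-Client-Project pattern back into fields, assuming for simplicity that the client segment doesn't itself contain a hyphen:

```python
# Minimal sketch: recover metadata from a YYMMDD-INV#-Client-Project file name.
# Assumes the date, invoice number, and client segments contain no hyphens.
def parse_invoice_name(file_name):
    stem = file_name.rsplit(".", 1)[0]  # drop the extension
    date_part, invoice_number, client, project = stem.split("-", 3)
    return {
        "date": date_part,              # YYMMDD
        "invoice_number": invoice_number,
        "client": client,
        "project": project,
    }

print(parse_invoice_name("240315-1042-Acme-Rebranding.pdf"))
# {'date': '240315', 'invoice_number': '1042', 'client': 'Acme', 'project': 'Rebranding'}
```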

I could easily create separate folders for each client and project and file the invoices in those folders with only the date and invoice number. When I do this, I'm moving the encoding of the metadata into the hierarchy and out of the file name. I choose not to do this, because invoices for all customers are processed in batches. To file invoices in individual client and project folders would require more work to navigate to the client and project folders and would provide little (if any) value.

In a system that doesn’t inherently support additional metadata for files – like the file systems on our computers – the right answer is to find an information architecture that supports the processes that you need to use and involves some sort of structured response to either the file naming or the placement of the files – or both.

Sidebar: Dates in Names

Every system keeps intrinsic metadata about files in the form of their creation date and modification date. In most systems, particularly file systems, it's sortable. This leads to the question of why we should put the date in the file name at all. The answer is that, because these intrinsic dates aren't readily settable, we can't establish the date for the file ourselves – and in rare cases, this can be problematic. So, while in most cases the actual document date isn't important enough to warrant inclusion in the file name, for invoices, their accounting importance made it seem like the right answer.

Enterprise Sync and Storage

It's been years now since file-system-based approaches to content storage and more traditional systems with metadata duked it out in the market to see which would win. It rapidly became apparent that file-system-based file storage was going to win by sheer volume. As a result, the metadata-based content management systems retreated a bit, and the analyst companies began talking about enterprise content management as part of a larger conversation about enterprise file synchronization and storage.

For most files, the effort to enter metadata wasn’t something the users were willing to do, and as a result, everyone gave up. Instead, they focused on the high importance and high value files and placed them into content management systems with metadata, ignoring the lack of metadata on most files.

Field Stacking

If we go back to the origin of computers, we saw techniques that squished multiple types of data into fewer, smaller fields. Back then, the reasoning may have been to avoid having to change a master record definition, concern about storage space, or a myriad of outdated reasons that led to smashing things together. What we learned as a result of this effort is that delimiting the data once it was smashed together was difficult.

Conceptually, parsing data out is simple: look for the comma, semicolon, or other delimiter, and you're done. In practice, these delimiter characters occur naturally in the data, so you must disambiguate between a character that's part of the data and one that's intended to separate fields. We recognize that there are times when field stacking must still be done to work around limitations; however, it's generally discouraged where possible because of the challenges that it creates. While we use file naming conventions to address limitations, we should do so only when we must.
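To illustrate the delimiter problem, here's a small sketch with made-up field values: naive splitting breaks when the delimiter occurs inside the data, which is why formats like CSV add quoting rules (handled here by Python's standard csv module).

```python
import csv
import io

record = ["Acme, Inc.", "Invoice 1001", "Pear Tree Landscaping"]

# Naive field stacking: the comma inside "Acme, Inc." collides with the delimiter.
stacked = ",".join(record)
print(stacked.split(","))  # 4 pieces instead of 3 -- the data is corrupted

# A real delimited format escapes the problem by quoting fields that contain the delimiter.
buffer = io.StringIO()
csv.writer(buffer).writerow(record)
buffer.seek(0)
print(next(csv.reader(buffer)))  # 3 fields, round-tripped intact
```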

Long Names and Cryptography

One of the other factors with stacking fields into a file name is that the names get long. The more information there is – and the longer that information is – the closer we approach file names that Windows can’t handle well. The original APIs for accessing files on Windows have a 260-character limit for the full path for files. While many applications have moved to newer APIs that have a much larger character limit, there are still numerous programs that are limited by 260 characters, thus long names and long folder names can become a problem.
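A quick way to see the effect of long, stacked names is to check full paths against the classic 260-character limit. This is a sketch with hypothetical folder and file names; the limit is hard-coded to match the legacy Windows APIs described above.

```python
MAX_PATH = 260  # classic limit for the legacy Windows file APIs

def over_limit(folder, file_name, limit=MAX_PATH):
    """Return True when the combined path would exceed the legacy limit."""
    full_path = folder.rstrip("\\") + "\\" + file_name
    return len(full_path) > limit

folder = "C:\\Projects\\" + "Very Long Departmental Folder Name\\" * 5
name = "240315-1042-Pear Tree Landscaping-Spring Rebranding Proposal Final v7.docx"
print(over_limit(folder, name))  # True: deep folders plus a stacked name blow the budget
```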

The solution to this is to create standard abbreviations and codes to shorten the name. This creates the additional problem that the file naming convention becomes so complicated that it is difficult to train people to use and becomes fraught with errors related to incorrect use of identifiers, abbreviations, and shortcuts.

Proxy

If we get to the fundamentals, the file naming convention is addressing the limitation of not having metadata support in the underlying system. It’s a proxy for having the information with the perception that, in the future, we’ll have an option to extract the metadata out and place it into appropriate buckets.

The rub comes in when we’re working in systems that are inherently capable of maintaining and managing metadata. In those cases, do we continue to invest primary effort into developing, maintaining, and enforcing file naming conventions, or do we shift our efforts to ensuring that the metadata is set correctly?

Metadata Advantages

File naming conventions are plagued with the problem of readability to the user. Fundamentally, they’re designed to allow a user to parse out important information to determine what they want. Because of that, the question becomes whether you use the friendly information or the record identifier. Said differently, do you use the name of the company or the company identifier? Names are easier to read but are subject to conflicts (e.g., two “Acme, Inc.” companies) and name changes.

Metadata-enabled solutions can conveniently side-step this issue by recording the unique identifier but displaying the name. Instead of having to choose between two difficult alternatives, metadata allows for easy and consistent identification of the record.

Additionally, both file name encoding of metadata and folder-based encoding of metadata are subject to a single navigational path. If you start a name (or path) with a date, you fundamentally enforce this approach on others as they're trying to find the file. Search technologies are sometimes helpful at finding a file based on part of the name. But because they make no distinction between parts of the name, collisions frequently occur, where the Pear Tree Landscaping company's records are almost unfindable when many of the projects that you've completed are "Pear Tree."

Metadata-based search approaches can be focused on a specific field, thus eliminating both the navigational path problem and the potential of matching based on another field. This is why nearly every content and record management system today offers storage for metadata in individual fields.
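A sketch of the side-step described above – store the stable identifier, display the friendly name, and scope searches to a single field. The records and field names are hypothetical.

```python
# Hypothetical records: the stable identifier is stored; the friendly name is displayed.
documents = [
    {"doc_id": 1, "client_id": "C-0042", "client_name": "Pear Tree Landscaping",
     "project": "Maple Grove"},
    {"doc_id": 2, "client_id": "C-0107", "client_name": "Acme, Inc.",
     "project": "Pear Tree"},
]

def search_field(docs, field, term):
    """Match against one specific metadata field, so 'Pear Tree' as a client
    doesn't collide with 'Pear Tree' as a project name."""
    return [d for d in docs if term.lower() in str(d.get(field, "")).lower()]

print(search_field(documents, "client_name", "Pear Tree"))  # only the landscaping client
print(search_field(documents, "project", "Pear Tree"))      # only the project
```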

What’s the Point?

The reason for file naming conventions was always file findability. The key to findability was identification of the key metadata – which became part of the file naming convention. These key pieces of metadata were organized in the approach that was expected to be the most valuable, recognizing that any one way of finding files would necessarily have its limitations. For those still using file-system-based approaches, file naming conventions are all we have to improve findability. However, for systems where metadata is available and searchable, metadata offers a much better way of making files more findable.

That isn't to say that file naming isn't valuable; it's just not as valuable as getting the metadata into the correct fields.

Book Review-How Buildings Learn: What Happens After They’re Built

I’m not an architect, but as an information architect, I’m curious about how classical architects approach the problems of buildings that people love. This journey led me to Stewart Brand’s book, How Buildings Learn: What Happens After They’re Built. I was first introduced to the book back in 2011 while reading Pervasive Information Architecture. It surfaced again in Nassim Nicholas Taleb’s book, Antifragile, and in Peter Morville’s Intertwingled. In every case, the reference is to how buildings change (or, in Brand’s language, “learn”) over time.

In information architecture, we’re faced with a rate of change that Brand and his colleagues couldn’t comprehend. While the idea of buildings being torn down in a few decades was alarming to the architects, as information architects, we don’t expect that our architecture will last a decade. The rate of change is too high.

Buildings Shape Us and We Shape Them

Winston Churchill famously said, “We shape our buildings, and afterwards our buildings shape us.” This is a simplification. In truth, after we have built a building, it shapes the way we interact with one another, and then we revise it to fit our new needs. It then further shapes us, and we repeat the process of adapting it.

The biological point of view is that of ecopoiesis – that is, how an ecosystem is formed. There’s some starting event (building a building for instance) and then continuous co-evolution of the organisms (humans) and the environment (building). It’s true that we shape our buildings and then they shape us – and vice versa.

However, like any ecosystem, the rate of change and adaptation isn't even across the entire system. Some parts of the system change quickly, and other parts move more slowly. It's these shearing layers that make changes in buildings so interesting.

Shearing Layers

Brand built on Frank Duffy's work and solidified a model of different layers of a building that operate at different speeds.

In the book's diagram, the layers are all represented. The idea is that the site is permanent (at least as permanent as tectonic plate movement allows), while the stuff in the innermost layer is ephemeral – it changes very rapidly compared to the rest of the building. Let's look at each layer:

  • Site – Permanent.
  • Structure – The most persistent part of the building. The lifespan of structure can be measured in decades to centuries. When the structure changes, the building has changed.
  • Skin – The façade or outer face of the building is expected to go out of style and to be replaced every 20 years or so to keep up with fashion or technology.
  • Services – These are things like HVAC, elevators, etc., which simply wear out over the period of seven to fifteen years.
  • Space Plan – Commercial buildings may change occupancy every three years or so, driving a change in the way internal space is allocated. Domestic homes in the US are, on average, owned for 8 years.
  • Stuff – These furnishings and flairs change with the seasons and the current trends.

The rates of change for different layers occurring at different times create shearing forces, where the slower-moving layers constrain the faster-moving layers. The stuff can only change and grow so much before the space plan gives way and, ultimately, before the structure itself would need to change. Anyone who has watched a reality television show about hoarders knows that their propensity to acquire stuff is limited by the space they have.

Different Types of Architecture

It turns out that, while all buildings share the same basic layers, there are categories of buildings that operate very differently. Commercial buildings are subject to market pressure and frequently changing tenants, making them more volatile. Domestic (i.e. residential) buildings have owners for longer periods of time and tend to be adapted with smaller changes rather than wholesale renovations on a periodic basis. Institutional buildings are relatively fixed and permanent, as the structure itself becomes a symbol of the institution. Institutions are often holders of trust, and change is resisted as much as possible. (See Trust: Human Nature and the Reconstitution of Social Order for more.)

The shearing layers vary inside these different types of buildings. A commercial building might replace the services at their planned end of life, because an unplanned outage is more disruptive than can reasonably be tolerated. Domestic buildings often run their services until complete failure. Institutions behave more like commercial buildings in their replacement of services but almost never replace their skin, in that respect more closely resembling domestic buildings.

Problems with Architecture

Brand spends a considerable amount of time discussing what is wrong with architecture and why it’s such a struggle to get good buildings.

Overspecification

The top culprit seems to be overspecification. That is, little is done to ensure that the building is adaptable to the purposes the occupants have once they're in it. Buildings are built so that it's difficult to get wires through walls, making it harder to adopt the latest technology. All buildings are predictions, and all of them are invariably wrong.

Brand breaks these into high-road and low-road buildings. The former are what architects typically build. They’re bright, shiny, expensive, newer, and difficult to change. Low-road buildings, by contrast, are more adaptable. They adjust to meet the needs of their tenants, and their tenants don’t mind adjusting them to fit their needs. These older buildings may not be a perfect fit for anything, but they’re a good enough fit for most things.

Leaky Roofs

Frank Lloyd Wright may be the greatest American architect of all time, but I don't want one of his buildings; all his roofs leak. He, in fact, quipped that that's how you know it's a roof. By the 1980s, eighty percent of all post-construction claims were for leaks. During the same period, malpractice rates for architects were higher than for doctors. It would be tragic if it weren't preventable.

We know what makes roofs that don’t leak. We know that flat roofs are going to leak – period. We know that, the greater the pitch, the less likely the roof is to leak. We’ve got all the knowledge, but because of the desire for appearance, we sometimes ignore what we know.

Wrong Metrics

What gets you into Architectural Digest isn’t how tenants love a building. What gets you into Architectural Digest are the pictures taken before people are in the building. It’s all about the design and none about the use. While some progress is being made in getting better metrics that measure – *gasp* – what occupants think, the progress is painfully slow.

If your measure is based on pictures that have little (or no) relationship to reality, it's no wonder that occupants of the building aren't happy. Architects aren't motivated to make the actual users happy with the building they get.

Poor Learning

Brand admits most of the architects he knows are hustling just to survive. That makes it difficult (if not impossible) for them to invest the right amount of energy learning about the latest materials, techniques, and ideas for improving their trade. While there are standards for continuing education now, Brand seems concerned that these aren’t sufficient when architects are under such constant pressure to produce just to survive.

What’s Love?

Still, some architects buck the trends and create buildings that people love – real people who really occupy them. Those architects, as they age, have found ways to create buildings that not only fit the occupants from the start but also adapt gracefully over time. This is a rare condition. My first highlight in the book is “Almost no buildings adapt well.” Adaptability and age seem to be the key ingredients to get people to love a building.

Hire an Architect

Brand is clear that the power in building buildings doesn't reside with the architects. While they'll be called in for a few showplace buildings, the developers drive most of the action. They may consult an architect at some point, but the architect rarely runs the show.

Because architects are often relegated to a small percentage of buildings where art is more important than functionality, the industry has become stuck. Most people don’t believe that architects are required, because the results they get when using an architect don’t seem to justify the expense.

Habitat, Property, Community

Buildings need to be three things at the same time. They're a habitat for their occupants. They're property – an asset. At the same time, they're also a part of the community. Buildings must fit their occupants and the place they're in. They must sit on the site that they're built on (with rare exceptions). Buildings aren't one thing. They are themselves a shearing layer between the wants and desires of the occupants and the desires of the community.

Markets, Money, and Water

There are three things that change buildings: markets, money, and water. If the market changes and the location (site) is no longer desirable, then changes will need to be made to the building to keep it acceptably interesting to potential occupants. Markets can cause buildings to be built and torn down in rapid succession. (Consider the churn of casinos in Las Vegas as an example.)

Money can mean radical changes to the building. A lack of money can stagnate change or send the building into an inevitable death spiral of maintenance and repair that, in turn, sends occupants scurrying for a new place.

Water is the great destroyer of buildings. David Owen said, “Houses seem to deteriorate from the bathrooms out.” It makes sense. Bathrooms are the place where there is the most water inside of a home, and, too frequently, the water isn’t vented outside. Most homes are built of wood and other materials that don’t do well with prolonged exposure to moisture. Mold grows in the presence of heat and moisture. Homes are designed to be warm enough for their human occupants, and bathrooms are nearly constantly moist.

Not all the changes brought on by markets, money, or, particularly, water are appreciated.

Vernacular

It's the way the locals speak. Vernacular is the native language of a region – and when it comes to building, it's the native way that people build. The way that homes are built in the North and the way they're built in the South are different. They're built differently in the extreme temperatures of the East Coast than in the relatively stable temperatures of the West Coast.

There’s not one right way to build a house or a building. However, there are ways that are more suited to the materials available in the region and the techniques that are effective in the climate. Architects do well to understand the region that their creations are going in, so they can mimic the best practices of the region.

Oversize Your Components

The best advice that Brand has is to oversize your components – he expresses this in every way, including the structure. You want to oversize the carrying capacity of the structure so that new floors can be added. Before computers were available to optimize everything, builders added extra capacity to ensure that everything would just work. Oversizing components creates a "loose fit." That is, the occupants can decide later to adapt the building. As was mentioned above, overspecification is one of Brand's key concerns – and oversizing is part of the antidote.

Open Offices

Before ending, it’s important to note that Brand spends a bit of time speaking of the fallacy of open offices. He explains that people want acoustic privacy but visual transparency. He explains that the initial experiments with open offices didn’t exactly succeed, but that didn’t stop the fad from catching on – much to the dismay of managers.

I’ve run across the open office idea numerous times in my career, particularly as I’m coaching clients on how to be more collaborative or innovative. They can’t seem to let go of the idea that they want a more open, collaborative space – until they’ve done it once. Everywhere you look, you’ll find evidence that it doesn’t work… but only if you’re willing to look.

For instance, in Creativity, Inc., Ed Catmull explains the genius behind the Pixar buildings and how they encourage interactivity. At first glance, it looks like the open concept at work. However, on deeper reading, you realize that Jobs created a common space people could go to – a space that had to be crossed for them to get to their private spaces.

Sometimes you just can’t stop a fad with facts, experience, or rationality.

Information Architecture

I read How Buildings Learn to stabilize my understanding of information architecture and learn from building architecture. The few key takeaways that I already knew but was reminded of are:

  • Vernacular – Build to the environment you’re in. In information architecture this means using terms that are familiar and approaches that work well with the users.
  • Plan for Change – Buildings get bad marks for their adaptability. Information architectures fail if they’re not hospitable to change.
  • Change Shears – Change doesn't happen at the same rate. There are parts of the system that should be designed to change slowly and others to change much more rapidly.
  • Oversize Your Components – While all predictions are wrong, predicting that things will grow is a safe bet. When you’re considering whether to put something in up front to be prepared… do it.

Whether you’re building an information architecture, a real building, or neither, I’m pretty sure you’ll learn something important from How Buildings Learn.

Book Review-Knowledge Management Matters: Words of Wisdom from Leading Practitioners

One of the many things that I enjoy about my life is the ability to walk from one world to another in a matter of moments. I’ve been a part of the knowledge management community for several years now. While far from all my time is spent in the community, I’ve come to know and respect many members of the community who are passionate about making the knowledge that each of us has more helpful to everyone else. That’s why I picked up Knowledge Management Matters: Words of Wisdom from Leading Practitioners. I wanted to know what they had to say about how we can better leverage what we know.

I'd count more than one of the authors as friends, and many of the book's conversations rang true to our prior conversations and discussions. But at the same time, the discipline that comes from writing a chapter for a book helped distill years of conversations into clarity.

Evolution of Knowledge Management

Nancy Dixon provides some evolutionary context to knowledge management. The framework she provides helps to understand the forces that are changing knowledge management. Just as one great continent doesn’t make sense until you understand how tectonic plates have been moving, it’s hard to understand the forces in knowledge management without an organizing framework.

From the relatively simplistic and formulaic solutions for information management through the ability to manage experiences and onward into an era of managing ideas, we’ve been on a journey to build systems – both technical and non-technical – to help us adapt, cope, and even flourish in a world where information and knowledge are as vital as gold was.

Coming from a technology background, I had a front-row seat as our capacity to create and manage information exploded. Moore's law is interesting until you have the flash of awareness that your first computer had 64 KB of RAM and your phone now carries 64 GB of storage. Your first hard drive was 20 or 30 MB; now what you consider to be disposable USB flash drives hold at least 16 GB.

Knowledge management is the same way. We started with knowledge bases and limited full-text searching. Today, we have social network analysis and natural language processing sitting on top of our search capabilities to enhance the results we see. The mountain of explicit information has demanded – and received – better tooling, while, at the same time, we’ve recognized the need to enable tacit connections as well.

The Right, The Wrong, and the Maybe

Knowledge management made some big promises, and, in most organizations, those promises weren’t kept. Like a jilted lover, businesses started rejecting knowledge management as a waste of time and money, leading to the proclamation that knowledge management is dead. Of course, like all things, there’s some truth, some fiction, and some unavoidable lack of clarity.

Knowledge management is fundamentally an organizational change initiative. John Kotter and others in the organizational change space admit that 70% of organizational change initiatives fail. (See Leading Change and The Heart of Change.) Because changing the way organizations share knowledge is itself an organizational change initiative, we shouldn’t be surprised that there are some failures – in fact, a lot of failures. However, this message doesn’t sell well to leadership. Few leaders who are deciding whether to pursue an organizational change initiative are ever told the failure rate, for fear that they won’t fund the project. The tragedy is that many boards could improve their odds of success if they knew what the risks were.

Knowledge is “squishy.” Some of it is explicit, and much more is tacit. (See Lost Knowledge for more on tacit knowledge.) Knowledge exists in relation to everything else, and it seems like everything is always changing. As a result, the knowledge we have one day may be partially or completely useless the next. For example, the knowledge of how to create prismatic glass for lighthouse Fresnel lenses stopped being useful once glass manufacturing changed; the knowledge no longer matched how glass was made. We face similar challenges with knowledge every day: it becomes useless as some other part of the process changes.

So, while knowledge is a critical asset of an organization, and we can, to a certain degree, manage it and encourage its use, in the end, knowledge isn’t stable and won’t be useful forever.

Best Practices

One of the topics that often arises when speaking about knowledge management is the desire to capture and replicate best practices. The idea is, of course, that if there’s one best way of doing things, then getting everyone to do it that way will generate better results. In theory, this is a great idea, but in practice, it doesn’t always work so well.

The first problem with best practices is importing them from one place to another. In the import, we face the high tariff of “not invented here.” Not invented here is the bias against using the good ideas of others in favor of doing what we’ve done all along. In medicine, it shows up as the doctor using procedures and tools that research has shown to be ineffective, because they trust their own experience – however flawed it may be – more than the research. Doctors aren’t the only ones who believe their own experiences over the data.

Our marketing world, where claims aren’t verified and conditions aren’t clearly articulated, only makes it harder to trust that someone else’s practice is better and more effective than ours. Some believe that they’re special, that the statistics don’t apply to them or their environment – and they may be right.

All knowledge is conditional to the environment in which it operates. The same advice may be appropriate in some situations and completely disastrous in others. Consider the advice to water plants weekly. Completely appropriate for many plants. Disastrous for the cactus that expects very arid soil.

There is no best practice. There are only practices proven to work in certain circumstances – and the catch is that, in many cases, we can’t enumerate and identify what the conditions were that were critical to this success. Without that, we have little hope of finding and leveraging best practices.

The best we can do in knowledge management is articulate what has worked and what the subject matter experts believe were the salient factors and hope that the transparency creates a degree of trust that allows people to take the risk of using the practices. (See Trust=>Vulnerability=>Intimacy for more on the impact of vulnerability on trust.)

Strength of Relationships

In a world where uncertainty is king, “an organization’s ability to respond to the unpredictable is largely a function of the strength of its relationships.” That is, an organization’s adaptability is related to how well its people work together. (See The Black Swan for more on unpredictable events.) This means that effective knowledge management solutions must support, enhance, and extend relationships in the organization in a way that increases their strength. Knowledge is not, by itself, capable of protecting an organization from the storms of change. Only the people that bring life to the organization and their relationships can.

For a long time now, knowledge management professionals have known that it’s about more than connecting people to content. Person-to-person connection is a huge part of how knowledge management works. The integration of social network analysis into search results indicates a growing awareness, even on the content-discoverability side, of the power of relationships.

Story Telling

Development of stories and their power to motivate and connect people seems like an unlikely topic for a book on knowledge management, but it’s critical to realize that the usefulness of knowledge depends on our ability to connect with it at an emotional level. (See Wired for Story and Story Genius for more on how stories are written and how they impact us.)

The knowledge that we capture in our systems can be dry and without story. For some content, this works just fine, but for the kind of knowledge that transforms people and thereby the organization, a story – or stories – is required. Stories are constructed so that the reader becomes emotionally connected with the characters. This connection drives the desire to find out more and creates the desire for learning that Malcolm Knowles and his colleagues say is important for adult learning. (See The Adult Learner for more.)

Communities and Participation

Communities can be an amazing way to drive knowledge. However, this only works when the community has the right kind of participation. Communities need to be a certain size to work well. Too big, and they become unwieldy. Too small, and they don’t generate enough activity to sustain themselves. As a rule of thumb, only about 10% of members will contribute, and as little as 1% will be routinely engaged. That means a community of a few hundred members is an ideal minimum to keep the conversations happening.

For smaller organizations, this means that the entire organization may be in one community. For larger organizations, the challenge may be keeping the noise level low enough that people feel like the community is theirs.
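
A minimal sketch of the arithmetic behind that rule of thumb (the percentages are the rough figures cited above, not measurements):

```python
# Rough participation rates from the rule of thumb above.
CONTRIBUTE_RATE = 0.10   # members who contribute at least occasionally
ENGAGED_RATE = 0.01      # members who are routinely engaged

def expected_activity(members: int) -> tuple[int, int]:
    """Estimate occasional contributors and routinely engaged members."""
    return int(members * CONTRIBUTE_RATE), int(members * ENGAGED_RATE)

for size in (50, 300, 1000):
    contributors, engaged = expected_activity(size)
    print(f"{size:>5} members -> ~{contributors} contributors, ~{engaged} routinely engaged")

# A community of 50 has essentially no routinely engaged members, which is
# why a few hundred members is a practical minimum to sustain conversation.
```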

Communities shouldn’t be organizationally based but instead interest-based. Teams and larger groupings already serve the need to organize around “strictly business.” Communities allow for the cross-functional and cross-locational collaboration that drives innovation.

Reading Knowledge Management Matters may not move you to the inside ring of the knowledge management community. However, it may be a good first step.

Book Review-The Ethnographic Interview

I’m about as far away from an ethnographer as you can get. I live in the heart of the United States and have lived in the same home for over 20 years. And yet, I use ethnographic interviewing in one form or another every single week. How can it be that I’m not embedding myself in new and strange cultures, and yet I so deeply value skills that resemble those an ethnographer needs? The answer lies in the techniques and thinking that The Ethnographic Interview teaches – and in my work world.

I came to The Ethnographic Interview by way of Peter Morville’s work, Intertwingled. He recommended it as a way to understand information architectures – and corporate cultures – more completely. I agree. All too often, the issues we have in understanding one another are about how our cultures differ, and no one has bothered to understand the unwritten meanings behind the words we use.

Requirements Gathering

Before I share some of James Spradley’s insights into ethnography, it’s important for me to cement the connection between what people do today and what ethnography is, so that its criticality can be fully understood. In IT, business analysts – by role or by title – seek to understand the foreign world of the business. They learn about logistics, manufacturing, marketing, accounting, and more in an effort to translate the needs of these groups for the developers and systems designers who will create IT systems to support them.

Even the experienced business analyst who knows the company and the department well must do their best to remove all of their assumptions and start fresh in understanding what the group is doing and what they need. While it’s technically impossible to remove all assumptions, because they are so good at hiding, the ethnographer’s task is to eliminate as many as possible and to test those that remain.

I wrote a course for Pluralsight some years ago, titled “Gathering Good Requirements for Developers,” where I teach a set of techniques designed to expose assumptions, test them, and make things feel more real and understandable on both sides.

The requirements gathering process, whether a part of agile design or traditional waterfall methodologies, is absolutely essential to being able to deliver what the business needs. The process of requirements gathering is ultimately a process of eliciting and understanding what the foreign culture is saying – even if that foreign culture is inside of your organization.

What is Ethnography?

An anthropologist is expected to be off in a foreign land, eating strange food and spending most of their time wondering what people are saying and what the heck they’re doing so far from those they love. Ethnography is their principal work: the systematic study of the culture they’ve embedded themselves in. Put differently, the goal of ethnography is (according to Bronislaw Malinowski) “to grasp the native’s point of view, his relation to life, to realize his vision of his world.”

Simply stated, it’s learning from people. However, there are several nuances. First, ethnographers invite natives to teach them. They don’t assume that they know or can learn the culture without help. Second, there are components of the culture that aren’t ever directly expressed. For instance, in the United States, the phrase “How are you?” is typically a greeting. The typical response is “I’m doing well, and you?” It doesn’t convey a real interest in the other person – until and unless it’s followed with, “I mean, really, how are you?”

Dig Deeper

If there’s one thing I’ve found that is a problem with requirements gathering, information architecture, or just working with other people, it is that we don’t truly understand. We believe we understand. We might be using the same words, but we just aren’t 100% in alignment. That’s where training in ethnography is really helpful.

Ethnographers observe behavior but inquire about the meaning. They understand objects but seek to discover the meanings that the culture assigns to these objects. They record emotions but go beyond to discover the meaning of fear, anxiety, anger, and other feelings.

In short, they dig deeper. They verify their understanding to ensure that what they believe they understand is actually right. Consider for a moment death. It’s the punctuation mark at the end of life – every life. Yet, different cultures view death differently. Some cultures keep death hidden – as is the Western point of view – while others embrace or celebrate it. Some cultures believe in reincarnation and others in an afterlife. It’s the same event, but it’s culturally very, very different.

Gary Klein explains in Sources of Power that we all make models in our head, and it’s these models that drive our thinking. He also shares how painful it can be to get these models to surface. The models are tacit knowledge that cannot be expressed in explicit language. In fact, Lost Knowledge differentiates between tacit knowledge and what’s called “deep tacit knowledge,” which are mental models and cultural artifacts of thinking that are so ingrained the person literally can’t see them.

The person the ethnographer is talking to – the informant – needs to be prompted to access the information they don’t know they know. A good ethnographer can tease out tacit knowledge from even the worst informants – but finding the right informants certainly makes it easier.

Indispensable Informants

If you follow agile development practices, you may notice that agile depends on a product owner who is intimately familiar with the business process that the software is being developed for. Lean Six Sigma speaks of getting to the gemba (Japanese for “the real place”) to really know what’s happening instead of just guessing. Sometimes this is also used to speak of the people who really know what’s going on. They do the real work.

The same concept applies to ethnographic research. You need someone who is encultured, really a part of what you’re studying. While the manager who once did the job that you’re looking to understand might be helpful, you’ll ideally get to the person who actually is still doing the work. The manager will – at some level, at least – have decided that they’re no longer a part of that group, and, because of that, they’ll lose some of their tacit knowledge about how things are done – and it will be changing underneath their knowledge anyway.

Obviously, your informant needs to not just be involved with the process currently, but they also need to have enough time. If you can’t get their time to allow them to teach you, you won’t learn much. Another key is that the person not be too analytical. As we’ll discuss shortly, it’s important that the informant be able to remain in their role of an encultured participant using their natural language rather than be performing translation for the ethnographer – as they’ll tend to do if they’re too analytical.

You can’t use even the best interviewing techniques in the world to extract information that no longer exists.

Interviews

The heart of ethnography isn’t writing the report. The heart of ethnography is the interviewing and discovery process. It’s more than just asking questions. It’s about how to develop a relationship and rapport that is helpful. The Heart and Soul of Change speaks of therapeutic alliance and how that is one of the best predictors of therapeutic success.

Tools like those described in Motivational Interviewing can be leveraged to help build rapport. Obviously, motivational interviewing is designed to motivate the other person. However, the process starts with engaging, including good tips to avoid judgement and other harmful statements that may make a productive relationship impossible.

For his part, Spradley in The Ethnographic Interview identifies the need for respect or rapport and provides a set of questions and a set of interviewing approaches that can lead to success.

Types of Questions

At a high level, ethnographic questions fall into three broad categories – descriptive, structural, and contrast questions. These questions allow the ethnographer to dip their toes into the water of understanding, structure their understanding, and understand terms with precision.

Descriptive Questions

Descriptive questions are by far the most numerous questions that will be asked. They form the foundation of understanding what is in the informant’s world and how they use the objects in their world. Descriptive questions fall into the following categories:

  • Grand Tour Questions – These questions ask for a tour around the topic
    • Typical Grand Tour Questions – Asking for a typical situation in their environment
    • Specific Grand Tour Questions – Asking for a specific time and what happened
    • Guided Grand Tour Questions – Asking to see the specific things happening in an area of the informant’s environment
    • Task-Related Grand Tour Questions – Asking the informant to explain a specific task that they do and how they do it
  • Mini-Tour Questions – Mini-tour questions have the same structure as grand tour questions but focus on a smaller area of the informant’s experience.
    • Typical Mini-Tour Questions
    • Specific Mini-Tour Questions
    • Guided Mini-Tour Questions
    • Task-Related Mini-Tour Questions
  • Example Questions – Asking for a specific example of something that the informant has answered in general
  • Experience Questions – Asking for experiences that the informant might have found interesting, relevant, or noteworthy
  • Native-Language Questions – Asking how the informant would interact with someone else from the culture – in the language that they use
    • Direct Language Questions – Asking what language they use to refer to something in their environment
    • Hypothetical-Interaction Questions – Asking questions about hypothetical situations that the ethnographer creates
    • Typical-Sentence Questions – Asking what kinds of sentences would be used with a phrase

Descriptive questions allow ethnographers to amass a large amount of information, but that information is unstructured and unconnected. While it’s necessary to spend some time in this space, after a while, it will become necessary to seek to understand how the informant organizes this information.

Structural Questions

As important as building a vocabulary is, understanding the relationships between terms is even more illuminating of the structures the informant uses to organize their world. We use symbols to represent things, and these symbols can be categories that contain other symbols. This is a traditional hierarchical taxonomy like the one you might build when doing an information architecture (see Organising Knowledge, How to Make Sense of Any Mess, and The Accidental Taxonomist).

In truth, there are many different ways that symbols can be grouped into categories, and understanding this structure is what makes the understanding of a culture rich. Spradley proposes a set of common semantic relationships that seem to occur over and over again (a small sketch of how they might be recorded follows the list):

1. Strict inclusion – X is a kind of Y
2. Spatial – X is a place in Y; X is a part of Y
3. Cause-effect – X is a result of Y; X is a cause of Y
4. Rationale – X is a reason for doing Y
5. Location for action – X is a place for doing Y
6. Function – X is used for Y
7. Means-end – X is a way to do Y
8. Sequence – X is a step (stage) in Y
9. Attribution – X is an attribute (characteristic) of Y
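
Here’s a minimal Python sketch of how these relationships might be recorded during analysis; the help-desk terms in it are hypothetical examples, not from Spradley’s book:

```python
from dataclasses import dataclass
from enum import Enum

class SemanticRelation(Enum):
    """Spradley's common semantic relationships."""
    STRICT_INCLUSION = "X is a kind of Y"
    SPATIAL = "X is a place in / a part of Y"
    CAUSE_EFFECT = "X is a result of / a cause of Y"
    RATIONALE = "X is a reason for doing Y"
    LOCATION_FOR_ACTION = "X is a place for doing Y"
    FUNCTION = "X is used for Y"
    MEANS_END = "X is a way to do Y"
    SEQUENCE = "X is a step (stage) in Y"
    ATTRIBUTION = "X is an attribute (characteristic) of Y"

@dataclass
class Statement:
    """A folk term (x) related to a cover term (y)."""
    x: str
    relation: SemanticRelation
    y: str

# Hypothetical notes from interviews in an IT help-desk culture.
notes = [
    Statement("password reset", SemanticRelation.STRICT_INCLUSION, "ticket"),
    Statement("escalation", SemanticRelation.SEQUENCE, "resolving a ticket"),
    Statement("the bridge call", SemanticRelation.LOCATION_FOR_ACTION, "handling an outage"),
]

for s in notes:
    print(f"{s.x!r} -[{s.relation.name}]-> {s.y!r}")
```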

Spradley proposes five kinds of structural questions designed to expose the semantic relationships of terms:

  1. Verification Questions – Asking for verification of a domain – or relationship between a set of terms
    1. Domain Verification Questions – Asking whether there are different kinds of a term that the informant has shared
    2. Included Term Verification Questions – Asking whether a term is in a relationship with another term
    3. Semantic Relationship Verification Questions – Asking whether there is a kind of term that relates other terms or if two terms would fit together in a sentence or relationship
    4. Native-Language Verification Questions – Asking whether the words spoken from the informant to the ethnographer are the words that would be used when speaking to a colleague
  2. Cover Term Questions – Asking if there are different types of a particular term
  3. Included Term Questions – Asking if a term or set of terms belong to another term
  4. Substitution Frame Questions – Asking if there are any alternative terms that could be used in the sentence that an informant has spoken
  5. Card Sorting Structural Questions – Asking informants to organize terms written on cards into categories and by relatedness. This is similar to an information architecture card sorting exercise; a small sketch follows this list. (See my post and video about Card Sorting for more.)
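
As a concrete illustration of that last point, here’s a minimal sketch of what a card-sort result might look like once captured; the terms and pile names are hypothetical:

```python
# Hypothetical card-sort result: the informant groups folk terms (cards)
# into piles and names each pile; pile names become candidate cover terms.
cards = ["password reset", "outage", "new-hire setup", "escalation", "bridge call"]

piles = {
    "routine tickets": ["password reset", "new-hire setup"],
    "incidents": ["outage", "escalation", "bridge call"],
}

# Sanity check: every card was sorted exactly once.
sorted_cards = [card for pile in piles.values() for card in pile]
assert sorted(sorted_cards) == sorted(cards)

for cover_term, members in piles.items():
    print(f"{cover_term}: {', '.join(members)}")
```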

Descriptive questions will be interspersed with structural questions to prevent monotony and to allow the ethnographer to fill in gaps in their knowledge. Though structural questions provide a framework for how terms relate, the strength of the relationships between terms isn’t always transparent. That’s why contrast questions are used to refine the understanding of how strongly – and in what ways – terms relate to one another.

Contrast Questions

Sometimes you can’t see differences in the abstract. For instance, our brains automatically adapt to changing light and convert something that may look blueish or pinkish to white, because we know something (like paper) should be white, even when the current lighting makes it look abnormally blue or pink. So, too, can the hidden differences between terms be obscured until you put them right next to each other. That’s what contrast questions do. They put different terms side-by-side, so they can be easily compared.

The kinds of contrast questions are:

  1. Contrast Verification Questions – Asking to confirm or disconfirm a difference in terms
  2. Directed Contrast Questions – Asking about a known characteristic of a term and how other terms might contrast on that characteristic
  3. Dyadic Contrast Questions – Asking the informant to identify the differences between two terms
  4. Triadic Contrast Questions – Asking the informant to identify which one of three terms is least like the other two
  5. Contrast Set Sorting Questions – Asking the informant to contrast an entire set of terms at the same time
  6. Twenty Questions Game – The ethnographer selects a term from a set and the informant asks a set of yes/no questions of the ethnographer until they discover the term. This highlights the hidden ways that informants distinguish terms. (This is similar to techniques like Innovation Games, where the games are designed to reveal hidden meanings.)
  7. Rating Questions – Asking questions about the relative values placed on different terms – along dimensions like easiest/most difficult, least/most interesting, least/most desirable, etc.

The sheer number of question types can seem overwhelming at first. However, many of these forms flow automatically if you develop a genuine interest in the informant and their culture. Still, it’s sometimes hard to learn a new language while also thinking about the next question you need to ask to keep the conversation moving.

Multiple Languages

In the case of an anthropologist who is working with a brand-new culture, it could be that they’re learning a whole new language – literally. In most cases, though, the language isn’t completely different and new to the ethnographer; it’s the use of the terms that differs. Just experiencing the difference between UK English and American English can leave someone a bit confused. A rubber in England is an eraser in the US, and a cigarette in the US is a fag in the UK. While both are English, the meaning and expectations of the word are quite different.

We often forget how differently we speak within a profession. A lexicon – a special language – develops around each industry, one that isn’t a part of the general consciousness. It’s the ethnographer’s job to discover not only that lexicon but also what the words mean to the rest of us.

Who Should Translate, and When?

When there are multiple languages, there is always the need to translate from one language to another. However, who does that translation – and when is the translation done? Informants, in their desire to be helpful, are likely to try to translate the information of their culture into terms that the ethnographer will understand. While the intent is helpful, the result is that the ethnographer doesn’t get to understand that aspect of the culture.

So, while translation is necessary, it’s best to discourage the informant from being the one who does the translation. The ethnographer can leave their notes in the native language and translate later. This also allows them to validate information with structural and contrast questions. Sometimes, it’s this review that reveals some of the underlying themes of the culture.

Themes

In most cultures, there’s a set of recurring themes. They aren’t explicit or stated, but they’re the sacred cows that everyone worships, and they shape the way the organization thinks. An entrepreneurial company has agility or velocity at the heart of the way it organizes its thoughts. A brand-focused company may be inherently focused on status or image. While these values aren’t typically articulated, they’re assumed, and they shape the way the organization thinks – about everything.

By having the opportunity to review and rework translations, these themes begin to emerge. The semantic relationships appear over and over again until it becomes apparent that they’re not specific ways of organizing a topic but are instead a way of organizing everything.

Depth

One of the challenges that I often see in requirements is that the business analyst doesn’t always spend the time drilling into the details and verifying understanding in a way that results in requirements that fully express the needs of the business and how it does its work. The ethnographic process – including the variety of questions – is one way to combat this challenge. It’s possible to leverage the ethnographic process to more deeply understand what is happening and how the systems are expected to help.

While I may be far from the fields of a foreign land and from people whose language I don’t speak, I often move from industry to industry and company to company, learning their languages and the ways they think about the world. The Ethnographic Interview is, therefore, a useful tool for helping me get a better understanding – and better requirements.

Book Review-Intertwingled: Information Changes Everything

It might seem odd that one of the forefathers of information architecture would proclaim that everything is intertwingled and thus hard to force down into specific categories, but that’s what Peter Morville is saying. You can’t separate the parts from the whole. Intertwingled: Information Changes Everything is the start of a journey in realizing that, no matter how hard you try to cut apart, partition, chunk, and dissect, most things can’t be broken down without losing something. That won’t stop us from doing it, because we need to do it to simplify our world. However, it may change the way that we approach the process.

Thinking in Systems

The vehicle in our journey is the ability to see everything as a system. There’s an old Tootsie Roll commercial where a jingle is played that ends with “Tootsie Rolls are all I see.” It’s a catchy tune that was designed to get people thinking about Tootsie Rolls, so they couldn’t forget about them. However, there’s some truth in life to this commercial. Once you see something, you can’t un-see it. It’s impossible to not know what a hippopotamus looks like after you’ve seen one. If you’ve learned to see things in systems, it’s hard to un-see them.

For Morville – and for me – everything is a system: a system that can be manipulated, changed, and adapted, often with unintended and inexplicable outcomes. Whether you realized that steel axe heads disrupted some aboriginal societies (as in Diffusion of Innovations), you got caught up in Peter Senge’s wildly popular The Fifth Discipline, or you stumbled across the posthumously published Thinking in Systems by Donella Meadows, once you’ve been infected with the awareness that everything is a system of stocks and flows (inbound and outbound), it’s hard not to see it any longer.

However, just because you can see things as systems doesn’t mean you can predict outcomes. Gary Klein recognized that fire commanders build mental models of a fire that let them predict what will happen. Their Sources of Power may not have been consciously known to them, but they did know when one of the expectations generated by the model was violated – and that it was time to go back to the drawing board and try to learn how the system was really functioning.

Fire commanders were strikingly good at predicting how the fires they were used to would behave. Their mental models worked. However, knowing how fires worked made them no better at prediction in other areas of their lives. They didn’t even realize that they were seeing the fire as a system and simulating its facets – they were blind to the fact that they were modeling it at all. (Seeing What Others Don’t is another work by Klein that helps explain how we develop the insights that fuel our mental models.)

Wireframes and Storyboards

Wireframes are a stock tool of the user experience designer and the information architect. Wireframes form the boxes. They’re the pictures of what’s being built. However, what Morville and others have discovered is that it’s what happens between the boxes that is truly interesting. In Understanding Comics, Scott McCloud explains that the story in a comic book happens in the gutter between the frames. In effect, it’s the arrows between the boxes that give the story its power. It’s what users fill in themselves that helps the most.

Story is about connecting one frame with the next. That’s why storyboards are an even more powerful tool for designers than the venerable wireframe. Storyboards, which are often used in designing feature films (see Creativity, Inc.), connect the dots and reveal what the wireframes hide in plain sight.

Rewinding to Agility

One of the criticisms laid at the feet of information architecture is that it reeks of “big design up front.” This is just the sort of waterfall-based logic that agile development teams launch scud missiles at. The problem is that too few practitioners of agile have read the Agile Manifesto or understand what information architecture is. They argue against documentation when the original founders of the agile movement were more concerned with ceremony and eliminating the waste caused by it. They didn’t want to eliminate documentation. They were focused on building and adapting instead of unnecessary documentation and unnecessary rigidity. However, there’s nothing in agile that argues against understanding what you’re doing. In fact, agile is focused on learning and understanding. The argument that agile levels against traditional waterfall (one-iteration) approaches is that they fall prey to the planning fallacy – the belief that you can plan through everything ahead of time. (See How Will You Measure Your Life? for more.)

In fact, agile aligns perfectly with systems thinking, where you do something, observe the results, and then do something else. Sometimes you reverse (or try to reverse) the latest change. Sometimes you head off in a different direction. Ultimately, you’re always making small changes to see how things react, then adapting.

Information architecture – like software development – can take on the characteristics of agile or traditional waterfall development. It all depends upon whose hands are wielding the tools. Some activities require developing understanding before intervening, but this is not always the case. Good information architecture recognizes that you won’t get it right the first time – or at one time. The needs of users to access content shift as attitudes and options change.

Memories

Our memories are fickle things. They aren’t like video recorders accurately storing what happened. Instead, memories are encoded and reduced into fragments and concepts that we rearrange when we try to retrieve them. (See Science and Pseudoscience in Clinical Psychology for more on the fallibility of our memories.) Our memories don’t encode words. Instead, we encode concepts and ideas. When we retrieve the memory, we reconstruct the concepts using the words that we have in our current vocabulary.

If you don’t believe me, try to recall a conversation that you had when you were 8-10 years old. Recite the dialogue as you remember it into a recorder. Then go back and review the recording. You’re likely to find that your words weren’t words that you had in your vocabulary back then. Your memories were reconstructed with the help of your current vocabulary.

This simple trick reveals one of the ways that our brain tries to fool us. (For more on the way that our brain fools us see Incognito.)

Words

Even though we don’t remember words directly, words are deeply embedded into the way that we think. If you feed people negative experiences, you’ll get back negative words. One of the favorite parlor tricks of pop psychology is to listen for the Freudian slip. That is, when the truthful thing slips out from underneath the weight of polite society. Chris Argyris has an exercise of left and right columns, where the right column is what was actually said, and the left column is what was thought or felt. This exercise exposes how the words we use aren’t the words that we mean, and how these words can lead us down unproductive paths. (See Organizational Traps for more on the two-column method.)

Words are the way that we create the mental framing that we use for a problem. That’s why the words we use in our taxonomies are so important. They can conjure up the right ideas or ones that don’t connect with our audiences.

Maps and Territories

They’re called the badlands. Technically, this is a geological term referring to the erosion of clay-rich soil and softer sedimentary rocks. However, the connotations of the word “bad” subtly influence millions of people to avoid these beautiful geological features. In this way, labeling an area as “badlands” on a map influences the visitors at Badlands National Park. The hidden connotations of the words we use shape how people see what we’re talking about, for better or for worse.

If I define a category for furniture, I’ll shape the way that people think about the category. They’ll picture in their mind what furniture means. They’ll see a chair, a couch, or a table. (At least 90% of the groups I do this exercise with say these three items.) If I’m looking for a rug, I won’t automatically look in furniture. Similarly, I’ll have a problem looking for a lamp. Is it in electrical or furniture? It has both properties.

So in one sense, a map isn’t a territory – but in another sense, it is, because it shapes the way that we think about the territory.

Multiplicity

There is an information architecture problem caused when the categorization labels make it difficult for consumers to decide which path they should follow. One approach to solving this is to use polyhierarchy – having one category included in multiple places in the hierarchy – said differently, having two parent categories for the same category.

While this is a necessary strategy for providing solutions to platypus-type problems (which don’t fit neatly in any one category), it can be an unnecessary crutch used because of poor category selection. Like other tools in the information architecture toolbox, it’s important to know when to use it – and when not to.

The best strategy for managing the problem of things that fall within multiple categories of an existing taxonomy is to use multiple taxonomies. That is, instead of creating one and only one hierarchy of terms, multiple taxonomies are used, each with its own set of terms that can be selected when appropriate – and omitted when not necessary.
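
To make the contrast between polyhierarchy and multiple taxonomies concrete, here’s a minimal sketch; the category and item names echo the furniture example above and are purely illustrative:

```python
# Polyhierarchy: a single taxonomy in which a node can have multiple parents.
polyhierarchy = {
    "Lamp": ["Furniture", "Electrical"],    # two parent categories
    "Couch": ["Furniture"],
    "Extension Cord": ["Electrical"],
}

# Multiple taxonomies (facets): each item is tagged independently in several
# small taxonomies, and a facet is simply omitted when it doesn't apply.
facets = {
    "Lamp": {"Room": "Living Room", "Function": "Lighting", "Power": "Corded"},
    "Couch": {"Room": "Living Room", "Function": "Seating"},  # no Power facet
}

def parents(term: str) -> list[str]:
    """Return every parent of a term in the polyhierarchical taxonomy."""
    return polyhierarchy.get(term, [])

print(parents("Lamp"))          # ['Furniture', 'Electrical']
print(facets["Lamp"]["Power"])  # Corded
```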

Intertwingled

With all we know about how the mind works and how we categorize, we still don’t know enough, and we still can’t see enough. Breaking things down into their components so that we can learn about them is a good strategy. (See Efficiency in Learning and The Art of Explanation for more on learning approaches.) However, it’s not a good strategy if we fail to learn how those pieces fit together, how they work together, and how they break the rules when they’re connected to one another. That’s what makes things Intertwingled. They become that way because, when you put the pieces back together, you don’t always get exactly what you would expect. If you’re ready to start your own journey, perhaps it’s time to learn to see the world Intertwingled.
