
The content – the second gateway into the mind of the audience

It is a long way from thoughts floating in the void to a consistent concept. Yet the coherence of the concept alone is not enough to convey it to the audience. With the right understanding of the traits and preferences of the target group, you can introduce contents more clearly. The content is thereby the second gateway into the mind of the audience.

Content

The following aspects make contents more understandable and effective.

  • Goal
    An elaborated concept contains many specifics. A list of details, however, provides meaningful insights only to a few people. It is therefore favorable to define a goal that you intend to reach with the contents – to convey an overview; to describe a certain area; to create curiosity for a topic. A presentation or a text with a clear goal makes the contents accessible to a broad group of people.
  • Target group reference
    Considering the characteristics of the target group provides a framework for the preparation of contents. Experts are used to navigating through huge amounts of information with tables of contents or indexes in order to get to useful contents. Non-experts need a simple structure and a comprehensible storyline that is not hidden behind technical terms. For this reason you need a concrete idea of the information the target group requires – Which interests exist? At which level of detail? Technical or easy to understand? Objective or emotional? With the appropriate “flight altitude”, the addressees understand the message better.
  • Core message
    With the clear goal and the audience in mind, the question arises about the core messages that you would like to convey. In any case you should limit yourself to 5 ± 2 messages, because even experts find it difficult to process more chunks. The contents are formulated around these messages. They comprise several aspects – the purpose of the message (e.g. conveying facts, requesting something, expressing one's own emotions) and the core elements of the message (e.g. objects, procedures, insights). Eventually, the audience can only remember those aspects that they are able to understand and process.
  • Facts and opinions
    Some information is generally well-known and verifiable. These are the facts. Other information is subjective and cannot be proven, but you are convinced of it. These are the opinions. The target group may believe these opinions or not. It is important to clearly differentiate between facts and opinions. Thus the probability increases that the desired aspects stick with the target group, whether they are facts or opinions.
  • Procedure
    The individual pieces of data do not hang timelessly in space. There is always a logical sequence – the discovery process or the dramatic composition of the story. Describing these aspects explicitly enables the audience to understand the contents better and to remember them longer. If you would like to create confusion or produce suspense, it is helpful to insert spontaneous, unforeseeable leaps in time. For the transfer of knowledge it is better to follow the natural development of the topic, since that way the target group can remember the contents more easily.
  • Outcomes
    The outcomes that were compiled are particularly important. The more concrete and useful the results are, the more easily the attendees internalize the contents. In the end, the drawn conclusions, the actual experiences and the results are a principal reason for the audience to accept the contents.

Bottom line: With the elements above, contents are better processed by the target group. Discussions receive the relevant information needed for a productive discourse. At the same time, the target audience can remember the contents better. Thus, the content is the second gateway into the mind of the audience.

Benefit through Big Data

The quantity of data doubles every two years. To date, over three zettabytes have been produced – i.e. 3,000,000,000,000,000,000,000 bytes. By 2020, according to IDC (http://ow.ly/Ao5v7), the volume of data will rise to 40 zettabytes. It is understandable that users anxiously shy away from this flood of information. The IT industry that promotes this flood has developed solutions for reassurance. They suggest that these very large, diverse and rapidly growing amounts of data can be controlled. These products are subsumed under the generic term Big Data. Relationships, meanings and patterns can be evaluated with special programs. From now on, the experts who generate the added value for the enterprise only need to know what has to be done in order to produce the benefits of Big Data.
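The doubling claim can be checked with a quick back-of-the-envelope calculation. The starting volume of three zettabytes and the two-year doubling period come from the figures above; the projection horizon is only an illustration:

```python
def projected_volume_zb(start_zb, years, doubling_period=2.0):
    """Data volume in zettabytes after `years`, doubling every `doubling_period` years."""
    return start_zb * 2 ** (years / doubling_period)

# Starting from 3 ZB and doubling every two years:
for years in (2, 4, 6, 8):
    print(f"after {years} years: {projected_volume_zb(3, years):.0f} ZB")
```

After roughly seven to eight years the projection reaches the order of magnitude of the 40 zettabytes cited above.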

BigData

What is new is the fact that not only internal data can be interpreted, but all accessible data, independently of its format. A prerequisite for the benefit is the capability of the business side to formulate the right questions correctly. The technical realization then follows through the IT department. IT thereby returns to its old function of evaluating data. The procedure consists of three steps.

  • Formulating questions
    The specialists have to specify their needs for information. First, questions are formulated (In which regions? Which products? Our own and those of competitors?). Then the sources are determined, as well as the time and the period for the evaluation.
  • Processing the data
    The experts of the IT department, the so-called data scientists, take over the questions and translate them into technical IT specifications – accessible databases, security-relevant aspects, data formats, compatibility, etc. From these, the programs that produce the results are derived, using SQL, NoSQL, analytics, visualizations, etc. The outcomes are eventually compiled and delivered to the business experts. This procedure resembles early data processing and its batch programs. However, a close cooperation between business experts and the IT department takes place in each phase. Additionally, the possibilities of evaluation have evolved significantly over the years.
  • Using the outcomes
    It is crucial to interpret the results correctly. For this purpose it is important to produce reports that provide correct, up-to-date, comparable, understandable and comprehensible results. Measures are then developed on this basis.
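The three steps above can be sketched in miniature. The table name, columns and sample figures below are purely hypothetical illustrations, not part of the procedure itself:

```python
import sqlite3

# Step 1 – the business side formulates the question:
# "Which revenue did each region achieve per product?"

# Step 2 – IT translates it into a technical evaluation.
# Hypothetical in-memory database; table and column names are assumptions.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE sales (region TEXT, product TEXT, revenue REAL)")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?)",
    [("North", "A", 120.0), ("North", "B", 80.0),
     ("South", "A", 200.0), ("South", "B", 50.0)],
)
rows = conn.execute(
    "SELECT region, product, SUM(revenue) AS total FROM sales "
    "GROUP BY region, product ORDER BY total DESC"
).fetchall()

# Step 3 – the outcomes are compiled into a comprehensible report
# that the business experts can interpret and act on.
for region, product, total in rows:
    print(f"{region:5} {product}: {total:8.2f}")
```

In practice each step involves far more work – data formats, security, visualization – but the division of labor between the question, the technical evaluation and the report remains the same.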

Over time, this procedure creates a lot of results that frequently cover different time frames and pursue diverse intentions. Since the data sources also change at great speed, you have to accept, like it or not, that reports have only a short half-life. For the users this means that

  • they must always be open to new insights,
  • outcomes have a short shelf life,
  • measures have to be taken at shorter intervals, and
  • results have to be forgotten more quickly,

in order to make room for new findings. Numbers, data and facts receive a new weight, since their qualified interpretation changes with more difficulty than their topicality and interplay.

Bottom line: Big Data permits the processing of all available internal and external data. This requires an appropriate IT infrastructure and, above all, the ability to formulate questions clearly and translate them into IT activities. Thus the importance of the IT department and of an efficient IT infrastructure rises. The benefit that results from Big Data requires a new way of dealing with the ephemeral insights that come from numbers, data and facts. The basis for decision making has to be renewed regularly.