What Adam Smith’s Classic, “Wealth Of Nations” Must Teach Nations About Technology


In 1776, the Scottish economist and philosopher Adam Smith wrote his masterpiece, ‘The Wealth of Nations’ (in full, ‘An Inquiry into the Nature and Causes of the Wealth of Nations’). By coincidence, the United States Declaration of Independence was adopted that same year, making the American colonies independent and thus no longer part of the British Empire.

America has since evolved to dominate the old British Empire in virtually every aspect of human endeavor, except perhaps social welfare. The Americans were, figuratively, disciples of Dr. Smith, who believed in the free market and argued that capitalism would benefit mankind more than any other economic structure. He laid this foundation at the onset of the industrial revolution and provided the basis for modern economics.

Smith made his case for the ‘invisible hand’ and argued that monopoly and undue government regulation or interference in markets and industry must be discouraged. He was of the opinion that prudent allocation of resources cannot happen when the state dominates and over-interferes.

In those days, American farmers could grow cotton but would not process it. It had to be sent to England, from where it was later imported back into the U.S. as a finished product. Understanding that this arrangement was not due to a lack of processing ability, you will appreciate Smith’s argument that markets must be free.

His theses were clear and very influential; they provided the same fulcrum for economics that Isaac Newton’s Principia Mathematica provided for physics, or, in modern times, that Bill Gates’ Windows provided for the information economy.

While reading Smith’s book and considering the time frame in which it was written, one cannot but appreciate its intellectual rigor. Before technology had penetrated en masse across the regions of the world, he noted that all nations competed at par in agricultural productivity. The reason was the absence of division of labor in every subsistence farming system in the world. A farmer did everything on the farm and was an expert at little of it.

Discounting fertile land, rain, and other factors that could help farmers, the level of productivity was similar for all farmers, from Africa to the plantations of Alabama. Why? No specialization was employed in the farming business at the time.

Fast forward to when the industrial revolution took hold. The British Empire became an engine of wealth creation through automation. It was a quintessential period of unrivalled human productivity, which resulted in enormous wealth for the empire. Technology not only sped up process execution; it also enabled division of labor.

Interestingly, Dr. Smith had noted that except for agriculture, where productivity was flat for lack of division of labor, other industries were doing just fine. In those industries, there were organized structures that enabled division of labor. For instance, in the construction industry there were bricklayers, carpenters, painters, and so on; but a farmer was simply a farmer.

As you read through Wealth of Nations and observe the 21st century, it becomes evident how influential technology has been over the last few centuries. It has changed our structures and created new business adaptations, such as outsourcing, which is indeed a new breed of division of labor.

From the accumulation of stock and pricing, as explained by Dr. Smith, we see today a world where technology is shaping wealth creation in very fundamental ways. In this era, wealth has become technology, because technology translates into wealth. So nations that focus on creating, diffusing, and penetrating technology will do well.

Why? It is about national technology DNA. The more passionate and innovative nations are triumphing on the global business scene. Mention Japan, and I will give you electronics. Mention the United States, and I will point to biotechnology and pharmaceutical technologies, and indeed every major technology. Mention China, and I will give you green technologies.

So, as nations continue to compete on the technology paradigm, we see at the highest level of success measurement an embodiment captured by technology capability. When nations are viewed through the lens of indices such as the Technology Readiness Index and the Knowledge Economy Index, we see that countries have become competing technology nodes. Some really poor countries with no (effectual) technology have no node at all and are unplugged from the sphere of global wealth creation.

Simply, it will be difficult to separate the health of any modern economy from its technology. It goes beyond the wealth of a nation to its very survivability. The most advanced nations are the technology juggernauts, while the least developed economies barely record any impact from technology penetration. For the latter, it is like still living in the pre-industrial age Dr. Smith described, when agriculture lacked division of labor and processes were inefficient.

Perhaps this explains the efficiency of the developed world in both the public and private arenas. The more technology they diffuse, the more productive they become. In other words, show me the technology and I will tell you where a nation stands in the league of countries. The invention of the steam engine changed the world and powered the industrial revolution; the invention of the transistor transformed the 20th century and is fuelling the new innovation century.

It seems that major scientific breakthroughs produce great countries. Let me emphasize that the old kingdoms that ruled the world, such as Babylon, the Roman Empire, and Pharaonic Egypt, each had an associated knowledge base that put them ahead. You cannot disassociate good crop production along the River Nile from the Egyptians’ mastery in inventing parts of geometry for farming. Some of the old wars were won by developing constructs that enabled efficient transportation of soldiers to the battleground. There was science, and nations were winning by using that knowledge.

In conclusion, the world has been living on technology, and it is indeed defining our competitive space. As nations compete, it is technology that shapes the world, with wealth as the major byproduct. I make this point because some of the best technologies were invented for reasons other than wealth, at least directly. Examples include the Internet and radar, which have created wealth and spurred commercial innovation but have military origins.

There could not be a more powerful way of examining national competitiveness than understanding the technology of nations. Yes, wealth has since morphed into technology, and all competition and wealth creation can as well be seen from a technology viewpoint. And in this piece, I aptly replace Dr. Smith’s ‘wealth’ with ‘technology’ to get The Technology of Nations.

What Are Embedded Systems? Why Do We Need Embedded Systems?


Embedded systems are small, fast, and very powerful tools, gadgets, and pieces of equipment that have become part of our everyday life. They are computer systems that do not look like computer systems to the everyday user. They form part of a larger system or product, part of anything from mobile phones to medical devices, from agricultural tools to manufacturing equipment. An embedded system is a microprocessor-based system that is built to control a function or range of functions and is not designed to be used in the same way that a personal computer (PC) is (Heath, 2003).

It is a combination of computer hardware and software, and perhaps additional mechanical or other parts, designed to perform a dedicated function (Netrino, 2011). In some cases, embedded systems are part of a larger system or product, as with an antilock braking system in a car. Although the user can make choices concerning the functionality, he cannot change the functionality of the system by adding or replacing software, as is possible with a PC.

On a PC, you can change functionality from word processing to games and then to mathematical computation simply by changing the software application, but this is not possible in embedded systems. An embedded system is designed to perform one or a few dedicated and/or specific functions, though with choices and different options (Michael, 2007; Heath, 2003).
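To make the contrast concrete, the dedicated, fixed-function nature of an embedded controller can be sketched as a single control loop that does one job forever. This is only an illustrative simulation in Python (real firmware would typically be written in C against hardware registers); the thermostat, its setpoint, and the temperature readings below are invented for the example.

```python
# Illustrative sketch: a dedicated-function controller, simulated in Python.
# Real embedded firmware would read sensor registers and drive an actuator;
# here the sensor readings are just a list of invented values.

class Thermostat:
    """Bang-bang temperature controller: the one job this 'device' does."""

    def __init__(self, setpoint, hysteresis=0.5):
        self.setpoint = setpoint
        self.hysteresis = hysteresis
        self.heater_on = False

    def step(self, temperature):
        # Turn the heater on below (setpoint - hysteresis),
        # off above (setpoint + hysteresis); otherwise hold state.
        if temperature < self.setpoint - self.hysteresis:
            self.heater_on = True
        elif temperature > self.setpoint + self.hysteresis:
            self.heater_on = False
        return self.heater_on

t = Thermostat(setpoint=21.0)
readings = [19.0, 20.8, 21.6, 21.2, 22.0]
states = [t.step(r) for r in readings]
print(states)  # heater switches on when cold, off when warm
```

Unlike a PC application, the end user of such a device can adjust a setpoint (a "choice") but cannot load different software to turn the thermostat into something else.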

Fig1 and Fig2 are examples of embedded systems. Today, more microprocessors around the globe are used in embedded systems than in PCs. Those already large numbers are increasing at a phenomenal rate as the devices that surround us in our everyday lives become smarter. This is a consequence of an insatiable drive towards having control over devices and access to data anywhere, anytime. Needless to say, we prefer them connected, wired or wireless.

Fig1: Picture of the internals of an ADSL modem/router. (A modern example of an embedded system. Labeled parts include a  (4), RAM (6), and flash memory (7)).

Fig2: PC Engines’ ALIX.1C Mini-ITX embedded board with an x86 AMD Geode LX 800 together with Compact Flash, miniPCI and PCI slots, 44-pin IDE interface, audio, USB and 256MB RAM

 

Why do we need embedded systems?

The first reason we need embedded systems is that general-purpose computers, like PCs, would be far too costly for the majority of products that incorporate some form of embedded system technology (Christoffer, 2006). Another reason is that a general-purpose solution might fail to meet a number of functional or performance requirements, such as constraints on power consumption, size, reliability, or real-time performance.

The digital revolution, which started decades ago, has reached a stage where we cannot conduct our normal modern daily lives without this technology. Indeed, it is safe to say that we already own at least one piece of equipment containing a processor, whether it is a phone, a television, an automatic washing machine, or an MP3 player.

The colossal growth of processing power in small packages has fuelled the digital revolution. All sectors of the economy have been influenced by it, and industry has experienced tremendous developments across all engineering disciplines (Bruce, 2011).

 

It’s A Small, Small World – Nanotechnology. Why Should Investors Care?


Editor’s Note: This is an exclusive article for Tekedia that we hope will help our readers understand the processes that take place as ideas evolve into investments. Mr. Aoaeh wrote this piece while an MBA student in New York Stern Business School. Though nanotechnology may not be applicable in your local market in Africa, the steps in this article are universal.

Introduction

I became exposed to nanotechnology during my days as an undergraduate student at Connecticut College, in New London, Connecticut. I pursued a double major in Physics and Mathematics, and had the good fortune of working as a research laboratory assistant in the Tunable Semiconductor Diode-Laser Spectroscopy lab, which is run by Professor Arlan W. Mantz, Oakes Ames Professor of Physics, and erstwhile chair of the Physics Department. My involvement with the lab spanned three years, and that experience played a critical role in my education.

Executive Summary: Investors ought to become aware of the opportunities and risks presented by the budding field of nanotechnology. 

What is it?

The term nanotechnology refers to a group of scientific processes that enable products to be manufactured by the manipulation of matter at the molecular level, i.e., at the nanoscale. One nanometer represents a length of 10⁻⁹ meters, one billionth of a meter[i]. Nanotechnology enables the manipulation of matter at or below dimensions of 100 nanometers. It draws from a multitude of scientific disciplines: physics, chemistry, materials science, computer science, biology, electrical engineering, environmental science, radiology, and other areas of applied science and technology.
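The scale arithmetic behind these figures can be checked directly; the hair-width value below is the commonly cited approximation of about 100 micrometers, used only for perspective:

```python
# Scale arithmetic for the nanometer definition. The hair width is an
# approximate, commonly cited figure (~100 micrometers), for perspective only.

nanometer = 1e-9      # meters
hair_width = 100e-6   # meters, typical human hair

# How does the 100 nm nanoscale threshold compare to a hair's width?
ratio = (100 * nanometer) / hair_width
print(ratio)  # about 0.001: 100 nm is roughly one-thousandth of a hair's width
```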

There are two major approaches to manufacturing at the nanoscale:

  • In the “bottom-up” approach, nanoscale materials and devices assemble themselves from molecular components through molecular recognition – small devices are assembled from small components.
  • In the “top-down” approach, materials and devices are developed without the manipulation of individual molecules; small devices are assembled from larger components.

Where is Activity Concentrated?

Research into nanotechnology and its applications is growing rapidly around the world, and many emerging market economies are sparing no effort in developing their own research capacity in nanotechnology.

  • Naturally, the U.S., Japan, Western Europe, Australia and Canada hold an advantage, in the short term.
  • China and India have made significant progress in establishing a foundation on which to build further capability in nanotechnology. A 2004 listing puts them among the top 10 nations worldwide for peer-reviewed articles in nanotechnology[ii].
  • South Africa, Chile, Mexico, Argentina, The Philippines, Thailand, Taiwan, The Czech Republic, Costa Rica, Romania, Russia and Saudi Arabia have each committed relatively significant resources to developing self-sufficient local nanotechnology industries.

Why should investors care?

Fundamentally, investors should pay attention to nanotechnology because of its high potential to spawn numerous “disruptive technologies.” Nanoscale materials and devices promise to be:

  • Cheaper to produce,
  • Higher performing,
  • Longer lasting, and
  • More convenient to use in a broad array of applications.

This means that processes that fail to provide results comparable to those available through nanotechnology will become obsolete rather quickly, once an alternative nanoscale process has been perfected. In addition, companies that fail to embrace and apply nanotechnology could face rapid decline if their competitors adopt the technology successfully.

The United States Government has maintained its commitment to fostering U.S. leadership and dominance in the emerging fields of nanoscale science. In its 2006 budget, the National Nanotechnology Initiative, a multi-agency U.S. Government program, requested $1.05 Billion for nanotechnology R&D across the Federal Government[iii]. That amount reflects an increase from the $464 Million spent on nanotechnology by the Federal Government in 2001.

Applications of Nanotechnology

Nanotechnology’s promise to revolutionize the world we live in spans almost every aspect of human endeavor. Today, nanotechnology is applied in as many as 200 consumer products.[iv]

  • Carbon nanotubes can be used to fabricate stronger, lighter materials for use in automobile bodies, for example.
  •  Researchers at Stanford University have killed cancer cells using heated nanotubes, while EndoBionics, a US firm, has developed the MicroSyringe for injecting drugs into the heart.[v] Other applications in medicine and biotechnology are in the pipeline.
  • MagForce Technologies, a Berlin based company has developed iron-oxide particles that it coats with a compound that is a nutrient for tumor cells. Once the tumor cells ingest these particles, an external magnetic field causes the iron-oxide particles to vibrate rapidly. The vibrations kill the tumor cells, which the body then eliminates naturally.[vi]
  • Cosmetics companies are actively exploring nanotechnology as a source of enhanced products, for example, cosmetics that can be absorbed more easily through human skin and that exhibit longer-lasting properties.
  • Advanced Micro Devices and Intel both use nanotechnology in the design and production of their most recent line of computer chips. Hewlett Packard is also actively involved in research to vastly enhance storage capacity on its computers and other electronic devices through nanoscale applications.
  • Nanotechnology has been applied in the garment industry to produce stain resistant fabrics, for example.
  • Nanotechnology companies in the developing world are pursuing solutions to problems peculiar to the developing world – for example, an Indian company is working on a prototype kit for diagnosing tuberculosis. There is great potential for the application of nanotechnology to agriculture.
  • Nanostructures are used in the production of Organic Light Emitting Diode (OLED) screens. OLED displays are replacing Liquid Crystal Displays in consumer products such as mobile phones, digital media products, and computer monitors.

Threats

In spite of its promise, nanotechnology faces threats that could impede its advance. Among them[vii],[viii]:

  • It is not yet clear how nanotechnology will affect the health of workers in industries in which it is applied. For example, how should we assess exposure to nanomaterials? How should we measure the toxicity of nanomaterials?
  • Public agencies and private organizations do not have a clear sense of how further progress in nanotechnology will affect the environment, or of the public safety issues that will accompany an expanded use of nanotechnology in industrial, medical and consumer applications. For example, what factors should risk-focused research be based on, and how should we go about creating prediction models to gauge the potential impact of nanomaterials?
  • The complexity of the science that is integral to nanotechnology makes it a very difficult area to regulate. It is likely that firms involved in the pursuit of nanoscale applications in medicine and pharmaceutics will face long delays in obtaining regulatory approval for the wide scale use of their products.
  • The complexity of nanotech-related patents has led to a backlog at the U.S. Patent Office. It now takes 4 years, on average, to process a patent application, about double the waiting time in 2004. This could lead to situations in which a firm’s intellectual property becomes public before it comes under patent protection, eroding any competitive advantage it had over its rivals.[ix]
  • It is not yet clear how society can protect itself from the abuse of nanotechnology. The public sector needs to collaborate with the private sector in developing protective mechanisms to guard against “accidents and abuses” of the capabilities of nanoscale processes and materials.

A note to would be investors

The average investor must remain keenly aware that firms involved in nanotechnology will have to assign significant resources to research and development. There is no reliable means of predicting the ultimate outcome of such activities, and the probability that any firm can maintain an enduring edge over its competitors is small. Investors should expect the mantle of leadership in innovation to change with a relatively high frequency. As such, pure-play nanotechnology firms will need to pay critical attention to means of sustaining market dominance that go beyond core competence in the science of nanotechnology.

Lux Research estimates that revenues from products using nanotechnology will increase from $13 Billion in 2004 to $2.6 Trillion in 2014. The 2014 estimate represents approximately 15% of global manufacturing output.[x]

In 2005, Lux Research and PowerShares Capital Management launched a nanotech ETF, The PowerShares Lux Nanotech Portfolio (PXN). In addition, Lux Research measures the performance of publicly traded companies in the area of nanotechnology through the Lux Nanotech Index™, a modified equal-dollar-weighted index of 26 companies. The companies in this index earn profits by utilizing nanotechnology at various stages of a nanotechnology value chain[xi]:

  • Nanotools – Hardware and Software used to manipulate matter at the nanoscale.
  • Nanomaterials – Nanoscale structures in an unprocessed state.
  • Nanointermediates – Intermediate products that exhibit the features of matter at the nanoscale.
  • Nano-enabled Products – Finished goods that incorporate nanotechnology.

Companies in the index are further classified as

  • Nanotech Specialists, or
  • End-Use Incumbents.

Investors must note that the investment characteristics of Nanotech Specialists are likely to differ markedly from those of End-Use Incumbents. The end-use incumbents that are part of this index include 3M, GE, Toyota, IBM, Intel, Hewlett-Packard, BASF, Du Pont, and Air Products & Chemicals. Because these companies have large, well-established and significant operations in arenas that do not rely heavily on nanotechnology, investors can expect them to achieve financial results that are only moderately volatile. In contrast, the financial performance of nanotech specialists will exhibit highly volatile swings, because:

  • With the exception of companies in the “picks and shovels” segment of nanotechnology, much of the work that many nanotech specialists engage in is still in the “trial and error” phase, and
  • There is no reliable means of predicting the results that heavy investment in R&D will yield.

Finally, it is likely that financial valuations of nanotech firms will fail to capture the true value of the intangible assets that provide the bedrock of each company’s ability to sustain innovation, create economic value, and protect its competitive advantage. If nanotechnology is truly the way of the future, then investors must embrace that future with enthusiasm layered with caution, by:

  • Performing an extra amount of due diligence before committing significant funds to investments in individual nanotechnology companies,
  • Limiting such investments to companies in the U.S., Japan, Canada, Western Europe, and Australia, in the near term, and
  • Following developments in the nanotechnology initiatives of the BRIC block of emerging market economies without committing any funds until a clear assessment of the future prospects of individual investment opportunities becomes possible.

Individual investors must exercise an extra amount of caution in pursuing nanotech investments, and should not commit more than they can afford to lose. Most individual investors with a desire to invest in nanotechnology should do so through PXN and similar instruments. Institutional investors must bring all their resources to bear in assessing the viability of a nanotech investment strategy prior to committing funds to this nascent area. For added security, individual investors that seek to invest in publicly traded nanotech companies should seek firms with the following characteristics:

  • No debt, and positive cash flows, and evidence of an ability to sustain profits.
  • Companies that supply corporate customers must not be too reliant on one customer.
  • Founders and insiders should have a significant and increasing portion of their net worth at stake in the company, and a track record in multi-disciplinary research.

A Cautionary Tale

Nanosys called off plans to sell 6.25 Million shares to the public on August 4, 2004. Merrill Lynch led a team of investment banks in the I.P.O. effort, which would have given the public a 29 percent stake in the company for shares priced between $15 and $17 each. Nanosys had hoped to raise $115 Million.

Investors may draw the following lessons from that occurrence:

  • Many nanotech companies face an up-hill task in converting promising research into products that can sustain a steady revenue stream.
  • A considerable number of nanotech companies may be surrounded by “more hype than substance”.
  • There is no guarantee that the price investors pay for an investment in nanotech will be adequate once all associated risks are taken into account. For example, the stock-price volatility of nanotech companies may exhibit greater systematic tendencies than normal. Other publicly traded nanotech companies saw declines in their stock prices when Nanosys cancelled its I.P.O. in 2004: Nanogen fell 99 cents to $3.81, Nanophase Technologies fell 31 cents to $5.69, and Harris & Harris, a publicly traded investment group with a nanotech focus that owned 1.58 percent of Nanosys, fell 85 cents to $8.53[xii].

A Note on Patents[xiii]

While it is true that there are serious delays in the processing of patents filed with the U.S. Patent Office, each patent that is filed is published automatically 18 months after the patent office receives the filing. This notifies the public that the process or product that is the subject of that filing has some probability of gaining patent protection. In legal parlance the owner of the patent application is said to have acquired “provisional rights”, and the term “patent pending” may be applied to the product or process. Companies and individuals may not launch lawsuits based on provisional rights. However, once the patent is approved the patent’s owner may sue for infringement of the patent going back to the date on which the patent came under “provisional rights” status. In essence, once an individual or company obtains provisional rights on a patent filing, they may begin to issue “cease-and-desist” warnings to competitors that may be attempting to appropriate the inventor’s intellectual property.

Uncertainty is the real drawback of the delays in the U.S. Patent Office’s processing of patent applications. While companies and individuals get provisional rights after 18 months, there is no guarantee that any given patent application will win eventual approval, or even that it will be reviewed within 4 years. Therefore, companies cannot rely on obtaining intellectual property rights to an invention, which may force them to hedge their bets in some other fashion.

Many risks accompany investments in nanotechnology. However, if nanotech’s promise is to be believed, it may yield significant returns to those investors who learn to harness its power.

December 22, 2006


[i] For perspective, 100nm is about one-thousandth of the width of a human hair.

[ii] Hassan, Mohamed H. A., Small Things and Big Changes in the Developing World. Science, Vol. 309, no. 5731, July 1, 2005, accessed on December 19, 2006 at http://www.sciencemag.org/cgi/content/full/309/5731/65

[iii] The National Nanotechnology Initiative, Research and Development Leading To A Revolution in Technology and Industry, Supplement to the President’s FY 2006 Budget

[iv] http://www.nanotechproject.org/index.php

[v] Bradbury, Danny, A Mini Revolution. The Independent (London) May 24, 2006.

[vi] Feder, Barnaby J., Doctors Use Nanotechnology To Improve Health Care. The New York Times, November 1, 2004, accessed on December 22, 2006.

[vii] Scientists Question Safety Of Nanotechnology, www.macnewsworld.com, November 20, 2006, accessed on December 22, 2006.

[viii] Markoff, John, Technologists Get A Warning And A Plea From One of Their Own, New York Times, March 13, 2000.

[ix] Van, Jon, Nanotechnology Hits A Patent Roadblock, Chicago Tribune, November 27, 2006.

[x] Gosh, Palash R, How To Invest In Nanotech, www.businessweek.com, April 17, 2006, accessed on December 22, 2006.

[xi] Adapted from www.luxresearchinc.com.

[xii] Feder, Barnaby J., Nanosys Calls Off Initial Public Offering, New York Times, August 5, 2004.

[xiii] Wesley T. McMichael provided the explanation that forms the basis for this section. Wes and I were classmates at Connecticut College. He majored in Physics, with a minor in Computer Science. He now practices patent law.

Barometer of Google – Google Buzz Has Struggled Since Launch


We continue our evaluation of Google products and major acquisitions and how they have fared against the competition. The unveiling of Google+ positions Google to challenge Facebook on the social media turf. How that will turn out, no one knows. But one thing is known: Google has had great hits and also big misses over the years. The Barometer of Google presents Google Buzz today.

 

Opened to the public in 2010, Google Buzz brought some of the features of Twitter into Gmail, Google’s email system. Nothing revolutionary happened, except that the launch was plagued by privacy concerns. It was billed to somehow take traffic away from Twitter, but that did not happen.

 

Google Buzz is a social networking and messaging tool from Google that is integrated into the company’s web-based email program, Gmail. Users can share links, photos, videos, status messages and comments organized in “conversations” and visible in the user’s inbox.

 

So, as we watch Google+, which some have termed Google social app 5.0, in its quest to dominate the internet as it has ruled over search, we will see how it plays out. But preliminary results show that Google+ could be the El Dorado that Google has been waiting for. Millions have signed up, and much money is waiting. Facebook should be worried because, as Tekedia noted, its valuation just took a hit.

Web Statistics Must Evolve From Number Of Visitors To Their Intensity And Engagement On Site


The structure of the modern media business relies heavily on the size of the audience to ascertain the advertising potential for firms. In broadcast TV, the goal is to raise ratings, measured by Nielsen, in order to command higher advert rates. This model is anchored on the construct that advertisers pay based on the number of viewers seeing their products. CPT (cost per thousand), which measures the cost of reaching a thousand viewers, is an industry standard.
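The CPT arithmetic is straightforward: divide the ad spend by the audience measured in thousands. As a quick sketch, with invented campaign figures:

```python
# Illustrative CPT (cost per thousand, also called CPM) arithmetic.
# The campaign figures below are invented for the example.

def cpt(total_cost, audience_size):
    """Cost of reaching one thousand viewers."""
    return total_cost / (audience_size / 1000)

# A $50,000 TV spot reaching 2,000,000 viewers:
print(cpt(50_000, 2_000_000))  # 25.0 dollars per thousand viewers
```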

 

Yes, a news aggregator gets a lot of bounce: as soon as people arrive, they depart. But people spend hours on social media sites. And when it comes to advertising dollars, most people quote the number of visitors, despite the obviously skewed nature of that statistic. Is it the number of visitors, or visitor engagement and intensity, that ad managers should focus on when deciding where to put money?

 

Imitating broadcast TV, the web business has a similar strategy that focuses on the number of people that visit a site. In other words, many web businesses have strategies that work to monetize the number of hits they receive on their sites. Any effort that gets customers to the web is a winning one. Advertisers pay based on the number of site visitors, and they can calibrate their investments with metrics similar to CPT.

 

Increasingly, we have seen that the structure of websites is designed to drive more hits, because that is the most important metric that can be monetized. The web has followed the magazine, newspaper, cable, and broadcast TV models despite having evolved in many ways beyond any of those industries.

 

While this model works, it is very deficient, especially for the web business. Unfortunately, the use of a CPT-like model continues to affect how marketing directors allocate marketing dollars. From my perspective, while the size of a web audience matters, the most important thing is the intensity that a website can create around the products advertised on it.

 

This is important because audience intensity offers more potential reward per ad dollar than audience size. Understandably, you have to bring people to a site before you can assess the intensity the site creates for them. However, it is safe to examine audience intensity within the context of a niche-media strategy, where advertisers spend money on sites where audience intensity is a more important metric than audience size.

 

Let me illustrate my case with an example. Facebook is the #1 website in the United States in terms of audience size; more people visit the site daily than any other. Naturally, it would seem the most natural place to advertise. However, the demographics on Facebook do not make it the best place to advertise the release of a new Lincoln MKS.

 

The reason is that Facebook may not create enough intensity for a product like the Lincoln MKS, which is usually associated with people over the age of 50. They will see the ad, but few will connect with it.

 

The same argument can be made for other products and websites. What is important is the level of intensity the viewers and site visitors have, and how well that matches the product being advertised. This brings up the need to measure web intensity.

 

The Internet era needs new statistics that transcend what traditional broadcast TV has used for decades. We must add the element of ascertaining the engagement of site visitors.

 

We already see websites that use many distractions to generate hits. Unfortunately, that statistic does not reveal how connected the visitors are to the site. We need to push further and get that intensity data for the future of the web business.

 

Closely related to this intensity is the need to ascertain how web content engages its visitors. You can have a passive website, like many news organizations, where you just come to read the news with no section for comments. Interestingly, many news organizations have tried to solve this problem by making sure that readers can post comments on content. This engagement is vital, as it helps assess the level of intensity visitors really have on the site. An intense visitor has a higher chance of clicking an ad than a passive one.

 

This makes the case for why I would be ready to spend more ad money on a website where people play online video games than on one where people just come passively to read headlines. The former brings intense visitors, while the latter may not be ideally intense; you may not easily get them to start clicking ads. It is possible that one group will be passive on one site and intense on another, making it all the more important to align ad products with site content.

 

Quantitatively, I would prefer a website with 1,000 daily visitors whose intensity is high over one with 1,500 passive visitors. I can assess that intensity by looking at how many clicks visitors performed on the site during their visits. Measuring how many clicks, not just how much time, a visitor logs per visit is very important for ascertaining how likely that person is to click an ad. That statistic reveals the attitude of site visitors and the value of selling ads on that site.
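One minimal way to operationalize this comparison is clicks per visit; the metric and the figures below are my own illustration, not an industry standard:

```python
# Illustrative "intensity" comparison: clicks per visit as a crude
# engagement metric. The visitor counts and click totals are invented.

def intensity(total_clicks, visits):
    """Average clicks per visit, a crude proxy for engagement."""
    return total_clicks / visits

site_a = intensity(total_clicks=6000, visits=1000)   # engaged gaming site
site_b = intensity(total_clicks=1800, visits=1500)   # passive headline readers

print(site_a, site_b)  # 6.0 vs 1.2: fewer visitors, far more engagement
```

On this crude measure, the smaller site with 1,000 engaged visitors outscores the larger one with 1,500 passive visitors, which is exactly the trade-off argued above.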

 

Take an example: many websites suffer what I will call ‘content shocks’. During an election period, the hit rates of many politics-focused sites go up; a few weeks after the election, they drop. The same applies to sports websites. Buying multi-year ads on such sites without accounting for those shocks would be irresponsible.

 

Personally, I have noticed that I rarely click on news content in the morning, since I have to get to work. In the evening, however, I do click that content when I visit news sites. Understanding how customer attitudes depend on the time of day will help marketing directors spend money wisely. You may focus on display ads in the morning, since in most cases visitors are not searching; they just want to read the headlines and get ready for work. In the evening, you can run text ads, when they are back home and can actually spend time on the web.

 

In conclusion, we need more statistics to turn the web business into pure science. Introducing metrics that go beyond visitor counts to site intensity is a necessary model for decoupling web advertising from the frameworks that have been used in the traditional media industry for decades. It will provide a more transparent vehicle for web advertisers and possibly help site designers focus on things that will help businesses advertise their products efficiently.