Category Archives: News

Leadership for the IT revolution

Leadership in some form or fashion is taught in every college and university on the planet and has been practiced in every organization that ever existed. Despite that omnipresence, as well as society’s fascination with leadership and ample journalistic treatment of what appears to be a perennial “leadership crisis,” many executives lack a framework to evaluate and improve their own leadership. “Good” and “bad” leadership remains for the most part a subjective, bordering-on-mood-based assessment.

For the past six months, I have been working with a group of early-, mid- and late-stage leaders to better understand the changing state of leadership. To get the ball rolling, stretch the mind and precipitate animated conversation, I asked this group of IT leaders if the traits that made Alexander “great” were still relevant today. They concluded that leadership has evolved significantly in the 2,400 years since the boy king conquered most of the known western world, with contemporary leaders perceived as being more community-focused. 

End of story? Far from it. The tension between the two extremes of leadership style has been studied for millennia. As Emma Dench, the McLean Professor of Ancient and Modern History and of the Classics at Harvard University, who co-teaches a popular elective course at Harvard Business School called “All Roads Lead to Rome: Leadership Lessons from Antiquity,” explains, “The Romans grappled actively with a very central issue of leadership: How much is a leader for themselves — or how much are they for the people as a whole. … ‘Is it just you on an island, or are you part of a community?’”

In recent times, community has been ascendant — but not universal. The tension remains. In 1991, Joseph Rost, professor emeritus of leadership studies in the School of Education at the University of San Diego, researched the state of leadership, examining 450 books, chapters and journal articles. His research documented more than 200 different and not always consistent ways the leadership industry defines “leadership.” The leadership industry cannot seem to make up its mind whether leadership involves going ahead of others, facilitating others to run, showing others how they should run, motivating others to run, designing the trail the runners run on, timing the run or giving prizes to the fastest runners. 

Source Article from http://www.computerworld.com/article/3195157/it-management/leadership-for-the-it-revolution.html

The human side of the data revolution

For over a decade, data has been at or near the top of the enterprise agenda. A robust ecosystem has emerged around all aspects of data (collection, management, storage, exploitation and disposition). And yet in my discussions with Global 2000 executives, I find many are dissatisfied with their data investments and capabilities. This is not a technology problem. This is not a technique problem. This is a people problem. 

Those enamored of data often want to eliminate the human from the equation, but it can’t be done. And so, as climate science considers the impact of man on the environment, data science must wrestle with the inverse: the impact of data on man. 

You can fill a library with books talking about the data revolution. There’s Viktor Mayer-Schönberger’s Big Data: A Revolution That Will Transform How We Live, Work, and Think; Steve Lohr’s Data-ism: Inside the Big Data Revolution; Malcolm Frank, Paul Roehrig and Ben Pring’s Code Halos; Bruce Schneier’s Data and Goliath: The Hidden Battles to Collect Your Data and Control Your World; Christian Rudder’s Dataclysm: Who We Are (When We Think No One’s Looking); Andreas Weigend’s Data for the People: How to Make Our Post-Privacy Economy Work for You; and my very own The New Know: Innovation Powered by Analytics. 

These are fine works of nonfiction focusing on the potential and perils of a rapidly informating world. “Informating” is a term coined by Shoshana Zuboff in her book In the Age of the Smart Machine (1988). It is the process that translates descriptions and measurements of activities, events and objects into data/information. “Datafication” is a synonym for “informating” — the trend associated with turning many aspects of modern life into machine-readable data and transforming this information into new forms of value.  
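
As a toy illustration of datafication, the hypothetical Python sketch below turns a plain-language description of an activity into a machine-readable record. The event label, field names and device identifier are invented for illustration, not drawn from Zuboff’s work:

    import json
    from datetime import datetime, timezone

    # A plain-language observation of an everyday activity.
    observation = "Driver braked hard near the warehouse exit at 8:14 a.m."

    # Datafication turns that description into a structured,
    # machine-readable record that software can aggregate and analyze.
    record = {
        "event": "hard_brake",           # hypothetical event label
        "location": "warehouse_exit",    # hypothetical location code
        "observed_at": datetime(2017, 3, 1, 8, 14, tzinfo=timezone.utc).isoformat(),
        "source": "telematics_unit_42",  # invented device identifier
    }

    print(json.dumps(record, indent=2))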

Source Article from http://www.computerworld.com/article/3190208/it-management/the-human-side-of-the-data-revolution.html

Preparing for the 15-year future

We are transiting a moment of massive uncertainty. The ambiguous road ahead requires making a prediction about predictions. I predict that, 15 years from now, 85% of the predictions, forecasts and trends set forth by academics, analysts, futurists and economists (maybe not the economists; their accuracy is increasingly suspect) will be influencing much of what happens day to day in business.

For IT executives, accelerating change will require them to constantly ask themselves, “What is the right problem to be working on today?” If the answer is not what they were working on yesterday, so be it; they must adjust and move on.

To best answer that recurring question and assure that energies and resources are focused appropriately, they need to answer some subordinate questions. 

With whom should I be talking?

One of the most important lessons my mother took from her years working in the intelligence community is this: Your network will keep you safe. Moving forward, IT executives need to cultivate powerful personal and professional networks. 

Source Article from http://www.computerworld.com/article/3176855/it-management/preparing-for-the-15-year-future.html

Data realities of 2017 and beyond

Data matters. You don’t need to be a professionally trained futurist to see that a critical driving force in the global economy for the next 10 years will be data exploitation (creation of value with data), data sharing and data protection. (Let data Exploitation, Sharing and Protection be your ESP.) Executives need to examine what they know and how they think about data.

That’s because data matters, now more than ever. The recent CES in Las Vegas drove this home (even for those of us who only read about it rather than suffering the crush of humanity). If you hadn’t accepted the reality of the internet of things (IoT), you couldn’t escape it at CES 2017. When the IoT becomes dominant, every object will be collecting and sharing data. Smart beds will be sharing data with smart thermostats, smart refrigerators will be sharing data with smart product packaging, and smart hairbrushes will be sharing data with human hair brushers. And everything will be a source of data. Proof, a prototype wearable from Milo Sensors, measures blood alcohol level through your skin. Aspiring baseball superstars can use a virtual batting cage in which they practice against a database of every pitch ever thrown by a particular pitcher.
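
To make the everything-shares-data picture concrete, here is a minimal sketch of the pattern those devices rely on: publish/subscribe messaging. The topics, device names and readings are invented, and a real smart home would use a broker-based protocol such as MQTT rather than this in-process toy:

    import json
    from collections import defaultdict

    # A toy in-process publish/subscribe bus. Real IoT devices talk
    # through a network broker (e.g., MQTT), but the pattern is the same.
    subscribers = defaultdict(list)

    def subscribe(topic, handler):
        subscribers[topic].append(handler)

    def publish(topic, payload):
        message = json.dumps(payload)  # data leaves the device serialized
        for handler in subscribers[topic]:
            handler(json.loads(message))

    # Hypothetical smart thermostat reacting to data from a smart bed.
    def thermostat_handler(reading):
        if reading["occupied"]:
            print("Thermostat: bed occupied, lowering bedroom setpoint to 18 C")

    subscribe("home/bedroom/bed", thermostat_handler)

    # The smart bed publishes an occupancy reading; the thermostat adjusts.
    publish("home/bedroom/bed", {"occupied": True, "heart_rate": 58})

The point of the pattern is that the bed never needs to know the thermostat exists; it simply publishes its data, and any object that cares can subscribe.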

The nature of relationships will change. Everything will have a relationship with everything — objects with other objects, objects with their manufacturers, objects with their human users. The robots showcased at CES 2017, such as Mayfield Robotics’ Kuri and LG’s Hub Robot, were notable not so much for what they could do as for the relationship their interface enabled — their personality. The Yui interface of Toyota’s “Concept-i” concept car collects data that apparently can tell when the driver is happy or sad and adjust the mood inside the car accordingly. The NeuV concept vehicle puts Honda on a path that will “enable machines to artificially generate their own emotions,” according to the company.

Data is power

For 420 years, no one has challenged Sir Francis Bacon’s formulation that knowledge is power. But in 2017 and beyond, data is power. The differences run deep. Knowledge is gained through study and experience. Data is merely collected, and in such a quantity that no human mind can begin to contain it. 

Source Article from http://www.computerworld.com/article/3165293/it-management/data-realities-of-2017-and-beyond.html

IT and the forever revolution

In this still young century, the IT industry has become obsessed with transformation and disruption. These two terms are little more than new labels for a centuries-old phenomenon that normal humans refer to as “revolution.” IT is all about revolution. It may seem paradoxical, but moving forward, organizations need to add revolutionary thinking and revolutionary behavior to standard operating procedure. In the future, to be sustainably successful, IT executives will have to become revolutionaries — at least part of the time.

Working with the Rady School of Management at the University of California, San Diego, the College of Engineering at Ohio State University and the trade and professional organization AIIM, I recently launched a research program focused on revolution in the IT ecosystem. 

When I began my research, several colleagues initially responded, “Isn’t this just more marketing hype?” Certainly, attention-grabbing marketers and seed capital–craving startups love to label every new micro development in the IT space as “revolutionary,” and the word peppers click-bait headlines touting “the big data revolution,” “the IoT revolution,” “the mobility revolution,” “the algorithm revolution,” “the DevOps revolution,” “the machine learning revolution” and “the connectivity revolution.” If everything in IT is revolutionary, is anything in IT revolutionary? Have we overused the word to the point of rendering it meaningless? 

It is my intention to wrest the term from the mitts of selfish and small-minded attention merchants and vulture capitalists and return it to where it belongs — into the arms of operating IT executives. 

Source Article from http://www.computerworld.com/article/3158216/it-management/it-and-the-forever-revolution.html

Leaders as communicators

It is remarkable how many leaders seem to overlook the undeniable correlation between mastering communication and successfully occupying a position of leadership.

The evidence has been piling up for centuries. Just on this continent, the Founding Fathers, in the late 18th century, were masters of the printed word via such documents as the Declaration of Independence and the Federalist Papers. In the 1930s and 1940s Franklin Delano Roosevelt owned radio with his Fireside Chats. Presidents Kennedy and Reagan excelled at television. And the president-elect, Donald Trump, may owe his position to his ability to parlay a communications trifecta: the political rally, the reality TV show and social media via Twitter.

I recently asked a group of CXOs their thoughts about the communication skills they thought would be required by the leaders of the future. 

Top-down communication is so last century

One key to effective communication is to focus on the most current modes. The political rally is an ancient form, but it’s still relevant, and Trump has been a social media pioneer. Meanwhile, the campaign of his opponent from the Democratic Party, Hillary Clinton, relied heavily on email. There’s reason to believe she might as well have tried to reach people by circulating clay tablets.

Similarly, there are CEOs in the world today who think that, once a strategy has been synthesized, they can use traditional information channels such as email to turn the ball over to underlings for execution. That’s old thinking. 

Source Article from http://www.computerworld.com/article/3149533/it-management/leaders-as-communicators.html

2017 will be a bad year for pessimists

I think that we can all agree that the 2016 presidential election cycle was brutal, an overwhelmingly negative attack on the attention spans of the global population. Call me a delusional optimist, but I forecast that despite all the Sturm und Drang of 2016, 2017 is going to be a bad year for pessimists.  

One reason that I conclude that things are going to improve is that society may have passed peak negativity in 2016. Peak oil, says Wikipedia, is “that point in time when the maximum rate of extraction of petroleum is reached, after which it is expected to enter terminal decline.” And, at least in the circles I run in, we are running out of negativity. More precisely — since even a chronic optimist will admit that we may never totally run out of negativity — we are suffering from negativity fatigue. 

I spent most of 2016 talking to executives with the word “chief” in their titles about what to expect in the near future. They pretty much agreed with Franklin Roosevelt, who said in his first inaugural address that “only a foolish optimist can deny the dark realities of the moment.” The inhabitants of the C-suite tend to be realists, and they recognize that there are a lot of things that need fixing. But they are also confident that we have the tools to do that. Here are some of the things that these executives are optimistic about for 2017. 

The end of demographic stereotyping

Millennials will come into their own in 2017. For the past decade, millennials have been on the minds of senior executives in every vertical market and discipline. We’ve seen a cottage industry set up to explain how millennials aren’t like anyone else, but the commentators often seem intent on stoking baby boomer generational anxiety. The truest observation is probably that what makes millennials unique is that no other demographic cohort has inspired so much misinformation. In fact, millennials are no less understandable, manageable and leadable than any other cohort in the history of man.  

Source Article from http://www.computerworld.com/article/3137450/it-management/2017-will-be-a-bad-year-for-pessimists.html

9 big ideas IT shouldn’t ignore

We live in an age filled with big ideas but devoid of any real consensus regarding how to prioritize them. Working with colleagues, I have attempted to systematically think about the big ideas that surround us: what they are, why they matter, the progress being made in adopting them and the obstacles preventing any particular big idea from moving to the next level. 

The big idea behind this research is that if we understood the world’s big ideas, we would be able to craft a better future. 

The 19th-century French novelist Victor Hugo wrote, “One resists the invasion of armies; one does not resist the invasion of ideas.” But apparently the invasion can be ignored. The first cognitive gut-punch I received in my research was that, in many organizations, the invasion of big ideas appears to have stalled.

Too often, big ideas exist in press releases, CEO public statements and marketing campaigns but are not part of the operational agenda. Why are they, in many instances, divorced from what happens day to day? Why do organizations that spend a lot of time researching and polishing what they say spend so little time making sure that what they say and what they do are synchronized?

Source Article from http://www.computerworld.com/article/3131833/it-management/9-big-ideas-it-shouldnt-ignore.html

The digital dissidents

Most businesses today are thinking about digital disruption. Either they are trying to unleash it, or they are seeking ways to avoid being victims of it, rendered irrelevant. Some are involved in both activities.

Most of their employees are on board with all of this. Research that I have been involved in at Ohio State University and the University of California, San Diego, shows that in organizations seeking to disrupt or prevent being disrupted, 89% to 97% of the workforce backs the disrupt/prevent disruption program. 

Those are impressive figures. They suggest that just 3% to 11% of the employee population is not fully supportive of digital disruption initiatives. Call them the digital dissidents. But don’t let those low numbers mislead you. In today’s workplace, the dissidents can carry a disproportionate amount of power — and, ironically, it is to a large extent the digital revolution that has empowered them.

In fact, once you think about the implications of that super-empowerment, you realize that 3% is a very large number. New York Times columnist Thomas Friedman talks about that in his soon-to-be-released book, Thank You for Being Late: An Optimist’s Guide to Thriving in the Age of Accelerations. We have to come to terms with what he calls the “Power of One.” If you think that is an exaggeration, read up on what a low-level contractor named Edward Snowden accomplished at the National Security Agency. Not long ago, it would have taken at the very least the director of the NSA to cause an equal amount of disruption. 

Source Article from http://www.computerworld.com/article/3118720/it-management/the-digital-dissidents.html

The world of yesterday, today and tomorrow

Eighty-two years ago this month, The New York Times published an article commenting upon the 20-year “Grand Canyon of History” separating the leaders of 1914 from the then-contemporary leaders of 1934. The general question was how leaders of one era differ from those of another. That interesting exercise gave rise in my mind to a two-part variation: How did the leaders of 20 years ago differ from those of today, and how will the leaders of two decades from now be different from those who lead today? 

How different was 1996?

Entering the Wayback Machine (also known as the WABAC Machine; if you’re under 40, you may need a Wayback Machine of your own to understand the reference, which is to The Rocky and Bullwinkle Show, though YouTube is a serviceable substitute), we see that in 1996 we are just entering the internet era. The global population is around 5.7 billion people. Only 45 million of them are using the internet: roughly 30 million in North America (United States and Canada), 9 million in Europe and 6 million in Asia/Pacific (Australia, Japan, etc.). Some 43.2 million U.S. households (44%) own a personal computer, and 14 million of them are online.

It’s a very different age technologically. Early in the year, Motorola introduces the StarTAC Wearable Cellular Telephone, the world’s smallest and lightest mobile phone to date. The chess computer Deep Blue defeats world chess champion Garry Kasparov for the first time. Pokémon Red and Blue are released in Japan by Nintendo. At the end of the year, Steve Jobs’ company NeXT is purchased by Apple Computer. 

Most people in 1996 don’t realize it, but the trends that will dominate two decades hence (such as mobile, machine learning/A.I., gamification and consumerization) are just kicking off. 

Source Article from http://www.computerworld.com/article/3110630/it-management/the-world-of-yesterday-today-and-tomorrow.html