In case you haven’t noticed, I haven’t been very faithful with posts here lately. That’s because my wife and I enjoyed a nearly three-week road trip. We drove coast to coast from our home north of Seattle to Charleston and Orlando for speaking engagements, then back home again. We saw 23 states and weather ranging from snow to blistering heat to a dark, scary thunderstorm in San Antonio. We survived a fire alarm in a hotel that had us standing on the street for an hour surrounded by fire trucks, and we survived a scary near-accident. We drove in comfort and economy, enjoying our new Lexus CT200h with its 42 mpg zippiness.
Home again, it’s time to turn my thoughts to what is going on in the world of crisis communication and PR. There is much, of course: still a lot about Bin Laden, about social media in emergencies (particularly in Japan), and then there is the Facebook, Google, and Burson Marsteller fiasco.
First, having met Mr. Burson and listened to him give a presentation a couple of years ago, I really feel for him. It would be a shame if the reputation he earned through a lifetime of stellar service, modeling public relations built on integrity, were sullied by this event. Somehow I can’t help feeling there is more to this story than what we are seeing.
Clearly, the fact that they were unwilling to divulge their client is a serious ethical problem. It’s hard for me to understand how two apparently savvy professionals thought they could manipulate coverage while hiding the company paying them. What is also hard for me to figure out is that Facebook fired them. Sure, like Chrysler, it’s the safe way of distancing yourself from a contractor when the contractor screws up. However, as this Wired article points out, Facebook cannot come out of this looking like the victim. My point is that issues like privacy and security are pretty technical, and the “information” about Google’s supposed privacy problems had to come from Facebook. So the two PR pros end up looking duped by Facebook into thinking there was a problem when apparently there really wasn’t. The Burson Marsteller staffers look to be the victims of Facebook’s manipulation rather than the perpetrators.
Regardless of what lies behind this sordid affair, the lessons are all too obvious. It comes back down to the basic issues of transparency and honesty. If what you are doing cannot stand the full light of day, then you had better ask yourself what your life will look like when it does come into the full light of day. I hate to think that fear of getting caught is a motivator for right and ethical behavior, but I’d rather it be that way than rely entirely on moral character. Somehow, that seems to keep failing us.
But I do think there is a deeper issue here. Apple has recently come under attack for storing users’ location information on iPhones and iPads. When asked for a comment about this, I noted that technology providers today face a bit of a dilemma related to using data generated by their customers. On the one hand, all that information provides the basis for some of the most powerful technologies–technologies that we benefit from and that are essential to winning the high-stakes innovation game. But often those advances depend on mining the ever-increasing stream of data being created.
I benefited greatly from the navigation system in my CT200h, including the warnings that would frequently come up about heavy traffic on my route or an accident up ahead creating stop-and-go traffic. How do they know that stuff? How can systems know traffic status on essentially all major streets and freeways across the nation? In this case, they must be tapping into data sources provided by states’ departments of transportation. When you start thinking of all the possible uses for the data being generated by the billions of people using smartphones and pad computers, it is truly mind-boggling. This issue of collecting, mining, and applying that data for useful purposes will not quickly go away.
Prediction: there will be many more battles to come over privacy, security, and the application of user-generated information. It’s a tricky road for technology providers, and ultimately users will have to decide how much they are willing to share and what they will give up by limiting access.