Start Dumbing Down Your Website

Google’s long-talked-about “mobile-first” strategy began rolling out in earnest earlier this year. Here is a quote from the Google Webmaster Central Blog of Monday, March 26, 2018:

“To recap, our crawling, indexing, and ranking systems have typically used the desktop version of a page’s content, which may cause issues for mobile searchers when that version is vastly different from the mobile version. Mobile-first indexing means that we’ll use the mobile version of the page for indexing and ranking, to better help our — primarily mobile — users find what they’re looking for.”

The “primarily mobile” users referred to are the 56% of website visits from mobile devices as opposed to the 44% from desktop computers (study). How the breakdown between desktop and mobile is determined from the far end is somewhat hazy, especially with smaller screen laptops and “netbooks”. Nevertheless, with more than half of website visits now originating from devices other than desktop computers with large displays, Google has decided that how your site performs on what are referred to as “mobile devices” will ultimately determine your site’s ranking.

CSS Is Not Enough

Many years ago, when all one had to worry about was whether an end user had a newer, larger monitor connected to their desktop computer, you could adjust the look of your site with a bit of JavaScript:
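The original snippet is not shown here, but it would have looked something like this sketch (the file names and the 1024-pixel breakpoint are illustrative):

```javascript
// Circa-2003 browser sniffing: pick a stylesheet from the reported
// screen width. File names here are made up for illustration.
function pickStylesheet(screenWidth) {
  // 800x600 monitors got the narrow layout; 1024x768 and up got the wide one
  return screenWidth >= 1024 ? "wide.css" : "narrow.css";
}

// In the page's <head> this was typically wired up with document.write:
// document.write('<link rel="stylesheet" href="'
//     + pickStylesheet(screen.width) + '">');
```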

Pretty clunky by today’s standards, but adequate 15 years ago.

Newer versions of HTML and CSS introduced media queries to address this issue:
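A minimal sketch of the idea (the breakpoint and selectors are illustrative):

```css
/* Small screens get the default rules */
body { font-size: 1rem; }
.sidebar { display: none; }

/* Wider viewports bring the sidebar back */
@media screen and (min-width: 1024px) {
  .sidebar { display: block; }
}
```

The browser re-evaluates the query whenever the viewport changes, so no script is needed.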

These workarounds, old and newer, try to adjust the look of your site depending on the screen size on which it is being viewed. To really make your site viewable across all devices, it must be “responsive”: it must respond to changes in screen size and even orientation.

To attempt a responsive design usually means a complete site re-do. Early websites were considered amazing if they had some images and iframes. Then tables came into vogue for content layout. When web developers admitted that tables were meant for tabular data, not layout purposes, it was all about divs. Now grid layouts and columns are being used. It is not just the arrangement of the page elements that needs to be considered. Fonts and images need to be resized. Font colours and background colours need to be changed to higher contrast for readability in bright light conditions. Navigation items such as menus and links need to work on a touch screen.
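The grid-and-columns approach mentioned above can be remarkably compact; a minimal, illustrative sketch:

```css
/* Cards reflow from one column on a phone to several on a desktop,
   with no tables, floats or scripting */
.page {
  display: grid;
  grid-template-columns: repeat(auto-fit, minmax(300px, 1fr));
  gap: 1rem;
}
```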

Most people, even those with development experience, throw up their hands and use a ready-made template.

A Separate Mobile Site Is Apparently Not the Answer

When smartphones were still in their infancy I taught a course on programming Pocket PCs. You may remember Pocket PCs as those devices with a delicate screen and a scratchy stylus. A common tactic back then, if a user accessed your Internet site on their small-screen device, was to redirect the user to an entirely different mobile “site” in a sub-folder of your main site. The mobile site was a stripped-down version of the “real” site, with less text and smaller graphics; it was really just a way to keep a user engaged long enough for them to consider visiting your real site from a real computer.

According to Search Engine Journal: “If you have a site configuration where the primary content and markup is different across mobile and desktop, you should consider making some changes to your site.”

Ironically, Google’s own “Basic principles” listed on their “Quality guidelines” article include the following:

  • Make pages primarily for users, not for search engines.
  • Avoid tricks intended to improve search engine rankings.

The constantly moving target that is Google Search Engine Optimization has spawned an entire industry devoted to promising top-of-the-first-results-page for less than the cost of Google AdWords. Now it is also padding the retirement accounts of web developers able to deliver responsive design sites. This is not necessarily a bad thing – after all, I do web development myself. What does concern me is that “quality” thing.

What Are Users Doing With Their Phones?

Back in college, one of my professors said that computers were mostly used for “LUS” (Looking Up Shit). The same can be said for mobile devices. Phones are used to get directions, check store hours, scan QR codes, check the weather forecast, and a host of other simple actions that streamline the user’s day. Few users are going to stumble across a copy of “War and Peace” online and start reading it on their iPhone.

SEO guru Neil Patel offers some friendly advice for optimizing your site for mobile devices:

“…reduce some of the burdens on the reader.
A 3,000-word blog post looks great on a desktop. But it can be intimidating on a mobile device.
Using short paragraphs to break up content will help…”

Yes, the web is becoming “Twitterized”. The one-liner format is taking over. Resistance, it would appear, is futile.

I am currently developing and maintaining a website for a local museum. The site is not “mobile-friendly” and will not be any time soon. The site contains lots of images, slideshows, and long articles that are just not going to work on a smartphone. We have made the decision to go with a separate mobile site that will contain quick lookup items such as hours, directions, upcoming event listings, self-guided walking tours and QR code lookups of information about local buildings. This decision will no doubt affect Google’s ranking of the museum’s site negatively.

The Return of the “Brochure Site”

When the Internet first exploded it was considered essential to have a site — any site — to promote your organization. Comparisons were often made to having a Yellow Pages listing.

As sites became more complex and included database connectivity, the much-touted educational potential of the Internet looked like it was becoming a reality. It would seem that the priority being given to small-screen mobile devices is turning back the clock.

Find other ways to bring traffic to your site or start dumbing it down.

Why are meetings so boring?

It’s difficult to find fans of business meetings anywhere. On the contrary, we see articles and blog posts with titles like “Why meetings are a waste of time” and sarcastic “get out of meetings free” cards. Some remedies for the seemingly endless/unnecessary meeting include no meetings at all, “fun” meetings with ice-breakers and games, shorter and/or virtual meetings, and specific meeting rules of order like strict agendas and “why am I talking” flow charts.

In her book Storytelling With Data: A Data Visualization Guide for Business Professionals, author Cole Nussbaumer Knaflic takes the usual easy shots at PowerPoint but argues that meetings and their requisite slideshows are often boring and ineffective because the data they are meant to communicate are presented in such a confusing, unintelligible manner that viewers just give up trying to understand them — what Knaflic calls the “ugh” moment.

Assuming that attending a business meeting is eventually inevitable, many observers have instead searched for the reason for that sinking-in-quicksand feeling that the typical meeting presentation can evoke. A frequent suspect is that “deck” that your well-meaning co-worker has toiled over, complete with jarring colours, theme music and cartoon-like slide transitions and animations. And yes, as if to avoid being accused of “not playing with a full deck”, it always seems to contain at least 52 slides.

Author Knaflic’s book aims to help change all that. She is a math wunderkind with a degree in applied math and an MBA. She has worked as an analyst on Wall Street and for the Gates Foundation, as part of Google’s People Operations team and as a professor at the Maryland Institute College of Art. Now also as an author, speaker and trainer she specializes in helping people make data understandable.

The book focusses on the presentation of numerical data in tables, charts and graphs. It is arranged in ten chapters, each containing a “lesson” with before and after examples. Everything suggested can be accomplished by users without design training using common tools like Excel. Drawing on research in cognitive science, design and psychology, the author advocates for a “less is more” approach of clean, uncluttered, visually pleasing data presentations that communicate with the viewer clearly and quickly.

“Studies have shown that we have about 3-8 seconds with our audience, during which time they decide whether to continue to look at what we’ve put in front of them or direct their attention to something else.”

Some of Knaflic’s maxims:

  • Pie charts are evil
  • Don’t use 3D
  • Clutter is your enemy
  • Use colour sparingly
  • Highlight the important stuff

Anyone who has to do a business meeting presentation should read this book. Once you have read it and put its lessons to use you will wish that everyone else had. Let’s put an end to boring meetings!

Big Data, Black Boxes and WMDs

As someone who worked in the database field for many years and taught database theory and programming at the college level, I was initially enthusiastic about the Big Data phenomenon. I am still a proponent but have developed some concerns about how Big Data is being used and misused. Cathy O’Neil’s book Weapons of Math Destruction: How Big Data Increases Inequality and Threatens Democracy could not have come at a better time.

O’Neil is a data scientist and author who earned a PhD in mathematics with a thesis on algebraic number theory, and she blogs about mathematics and politics on her site. She went from teaching at Barnard College and Columbia University to a job as a “quant” for a Wall Street hedge fund just a year before the 2008 meltdown. Her book provides an inside look at the computer algorithms that not only run Wall Street but that increasingly run and/or ruin our lives. Software — designed to analyze huge amounts of data and then spit out answers about everything from jail sentences to college rankings and recruitment, policing, hiring, employee management, teacher ratings, credit-worthiness, insurance, advertising and political polls — is analyzed and dissected.

Originally designed to remove human error, improve efficiency and cut costs, the algorithms built into proprietary software have taken on a life of their own. As O’Neil demonstrates again and again, those algorithms are often faulty but can still deny people a job, deny people a loan and deny them insurance, trapping them in a cycle of poverty. The problem is that those algorithms are secret and cannot be questioned.

The Black Box

When I first started teaching programming most development was focussed on desktop applications. The goal was to create a compiled executable program that could be sold for installation to the hard drive of a desktop computer. The compiled executable and the EULA warnings about reverse engineering the code within were used to protect that code from unscrupulous rival software developers. Students were taught to think of and design their applications as a series of “black boxes”. Instead of a giant, hard-to-maintain, monolithic block of code, the black boxes were code modules designed to perform a specific function and to hopefully be reusable within the application. They were a means of breaking the program down into more manageable pieces that could be created by individual members of a team of developers, tested and then left alone. Specific inputs would produce expected outputs and the functionality used to produce those outputs was purposely obfuscated. All another developer on the project needed to know was how to “wire up” one of these black boxes of code and then “call” for its output.
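In miniature, the contract looks like this (a hypothetical example, not from any of the applications discussed): the caller sees only the inputs and the output, never the implementation.

```python
# A "black box" in miniature: the caller relies on the documented
# contract (inputs -> outputs) and never looks inside.
def monthly_payment(principal, annual_rate, months):
    """Return the fixed monthly payment on an amortized loan."""
    r = annual_rate / 12  # monthly interest rate
    return principal * r / (1 - (1 + r) ** -months)

# Another developer just "wires it up" and "calls" for the output:
payment = monthly_payment(1000, 0.12, 12)  # roughly 88.85 per month
```

If the formula inside were subtly wrong, every caller would still happily accept the numbers it produced.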
A common problem with this approach to programming would come to light when the original application reached the end of its life cycle and was being replaced by a newer application. The new application was tested by users and reported to be faulty. Developers went to work testing and re-testing and could find nothing wrong. In the end it was often discovered that the original application had been wrong all along, but its results had been accepted as correct for so long that users trusted them and doubted the new, correct application. Something was amiss inside one of the old application’s “black boxes” but it was difficult or impossible to see inside.

When I was working as a data analyst for a financial institution we were converting a large block of management reports created using Crystal Reports to SQL Server Reporting Services. I ran into a problem with one of the managers in the Loans Department. She insisted that one of the new reports I had created for her was incorrect. The report was counting and totalling information about student loans. Loan officers in the branch offices had to manually enter the word “Student” in a LoanType field on the loan application screen. In the old report, the SQL code was not in a black box so it was plainly visible as WHERE LoanType = 'Student'. When testing I noticed a large number of variations entered in the LoanType field: “Student loan”, “School loan”, “ Student” (with a leading space), “Student ” (with a trailing space), “Stdent”, etc. I modified the SQL query for the new report to account for these anomalies, so the totals for the faulty old and the corrected new reports did not match. It took a couple of heated meetings with me, the loan manager and my manager to get the loan manager to grudgingly accept the idea that the data she had been relying on for years had been faulty all along.
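The situation is easy to reproduce; here is a toy reconstruction using SQLite (the table, column and data are illustrative, not the institution's actual schema):

```python
import sqlite3

# Toy reconstruction of the LoanType problem: the old report matched only
# the exact string 'Student', while loan officers had typed all sorts of
# variants. Table and data here are made up for illustration.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Loans (LoanType TEXT)")
conn.executemany("INSERT INTO Loans VALUES (?)", [
    ("Student",), ("Student loan",), ("School loan",),
    (" Student",), ("Student ",), ("Stdent",), ("Mortgage",),
])

# The old report's plainly visible filter -- misses every variant:
old = conn.execute(
    "SELECT COUNT(*) FROM Loans WHERE LoanType = 'Student'"
).fetchone()[0]

# The corrected filter: trim whitespace, normalize case, tolerate the
# common misspelling, and catch the 'School loan' wording too:
new = conn.execute(
    "SELECT COUNT(*) FROM Loans"
    " WHERE REPLACE(LOWER(TRIM(LoanType)), 'stdent', 'student')"
    "       LIKE 'student%'"
    "    OR LOWER(TRIM(LoanType)) = 'school loan'"
).fetchone()[0]

print(old, new)  # the two totals differ -- exactly why the reports disagreed
```

Which of the two counts is “correct” is precisely the kind of question a black box prevents anyone from asking.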

Big Data, Big Mistakes

In O’Neil’s book she identifies similar problems. Software used by police departments, insurance companies and financial institutions can be faulty but there is no way to discover the faults so users go on believing the results. The software code is a black box.

Even if the code itself is good, the premise behind the software may be to blame. The software used by police departments deploys more police in areas that Big Data tells them are high crime areas. More arrests are made in those areas so the next round of data shows that even more policing is needed there and seems to prove the original premise of the software. These areas are probably populated by poor minorities that become victims of the numbers game. People with poor credit — sometimes incorrectly reported — have their job applications rejected by HR software so they fall even deeper into financial hell. Poor working class people living in a “bad” postal code pay higher rates for the car insurance they need to drive to their low wage jobs, again perpetuating a cycle of poverty.

The book goes into a number of examples in detail.

Because algorithms…

O’Neil is not against the use of Big Data and is not a Luddite about programming. She is a data scientist, remember. What she warns about — and I have to agree with her — is the misuse of data and a profit-centric model for software that puts corporate interests ahead of human interests. I also feel that we need programmers and start-ups to throttle back the hubris a bit and software vendors to employ less snake oil. The 21st century may be an age of data but it also needs to be a human age.

Raspberry Pi Web Server


If you do any serious web development using PHP and MySQL, you will eventually realize the need for a test server on your local network. For years I used a re-purposed PC running Ubuntu Server. I would code the pages using Visual Studio running on my Windows workstation and then upload them to my test server. I used MySQL Workbench to connect to the MySQL database running on the server. The old PC finally died and I was casting around for alternatives when I happened upon the chapter “The Pi as a Web Server” in the Raspberry Pi User Guide by Eben Upton and Gareth Halfacree.

If you are not familiar with the Raspberry Pi, briefly it is a tiny computer on a single circuit board that was developed in the UK as an educational tool to make learning about computers and programming affordable for schools, hobbyists and the “Maker” community.

Getting the hardware

Raspberry Pis are readily available on Amazon and eBay. I bought a Raspberry Pi 2, Model B packaged with a plastic case and a power supply for around $50 Canadian. This particular model has a quad-core 900 MHz processor, 1 GB of RAM, a full HDMI port, an Ethernet port and 4 USB ports. The other connectors on the board will not be used for this application. The power supply connects via a micro-USB plug, much like a smartphone’s.

The Pi uses a micro-SD card to store the operating system, software and files. A minimum capacity of 4 GB is recommended. I purchased a blank 32 GB micro-SD card along with an SD adapter to connect to my PC.

To get up and running you will need a USB keyboard and mouse, plus an HDMI cable to connect to an HDMI port on a TV or an HDMI-to-DVI adapter cable to connect to a monitor. VGA is not supported; you can buy an HDMI-to-VGA adapter, but they are quite expensive. I bought an HDMI-to-DVI cable online for under $10. You will also need a network cable to connect to your router.

Installing the software

Some Raspberry Pis are sold bundled with the NOOBS installer utility preloaded on a micro-SD card. If you did not purchase such a card, you will need to install the operating system on the micro-SD card yourself. The card can be flashed with an operating system image directly, but the easiest method is to use the NOOBS installer utility, a free download. If you have a blank card you will have to format it first using SD Formatter, also a free download. Once your card is ready, the NOOBS utility can be unzipped and copied to it.

Insert your prepared micro-SD card into the port on the board. Connect your USB keyboard and mouse. Connect your TV or monitor to the HDMI port. Connect your Pi to your router with the network cable. During setup, the Pi will connect to the Internet to download updates. Later on you will need to set a static IP address for the Pi in its role as a web server, but for now it is best to make sure that DHCP is enabled on your router so the Pi can grab an IP address for immediate use. With all connections made, plug in the power supply and the Pi will start up.

The NOOBS setup utility will display on your screen, offering a choice of operating systems to install. For use as a web test server, Raspbian is best; it is a version of Debian Linux modified for the Raspberry Pi. The Pi will restart to the Raspbian desktop after installation of the operating system is complete. On a test server a GUI consumes resources unnecessarily, so it is best to change settings so that the Pi boots to a command line; the rest of the setup is done from there. Raspbian’s command line uses the apt-get tool, which you will be familiar with if you are a Debian or Ubuntu user. If you need a tutorial on using the Linux command line and apt-get, there are plenty available online.

Use apt-get to install the Apache web server, MySQL and PHP; they will be downloaded automatically. Make sure to record your username and password for MySQL. You will need them later to connect to MySQL remotely via a GUI tool running on your workstation; there are plenty of these tools available as free downloads. At this point, after using ifconfig to find the temporary IP address assigned to the Pi by DHCP, you should be able to view the Apache and PHP confirmation pages from a browser over the network, indicating that your web server is up and running.
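The install itself is only a few commands; a sketch assuming the Raspbian of that era (package names such as php5 changed in later releases):

```shell
sudo apt-get update
# The Apache web server, MySQL server, PHP, and the PHP-MySQL bindings:
sudo apt-get install apache2 mysql-server php5 php5-mysql
# Note the DHCP-assigned address so you can browse to the test pages:
ifconfig eth0
```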

Final steps

Your web server needs a host name and a fixed IP address. You will also have to set the time zone and locale. These tasks can be done from the command line using the built-in raspi-config utility and nano text editor. I turned off SSH as a security measure and since I did not plan to run commands remotely on the Pi.
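On Raspbian releases of that vintage, the static address went into /etc/network/interfaces; a sketch with example addresses (match them to your own subnet; newer releases use /etc/dhcpcd.conf instead):

```
# /etc/network/interfaces -- static address for the web server
auto eth0
iface eth0 inet static
    address 192.168.1.50
    netmask 255.255.255.0
    gateway 192.168.1.1
```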

The most time-consuming step for me was setting up a way to see files on the Pi over the network from my Windows workstation. I used apt-get to install Samba, which allows the creation of a “share” – a folder on the Pi that can be mapped as a network drive on your workstation. There are a few steps here and I went through a lot of trial and quite a bit of error. I created a Linux user matching my Windows username and password. I then created a Samba user with the same name and password. I gave the Linux user ownership of the /var/www/html folder where the Apache server keeps the web files. An smb.conf file was created using the nano text editor. This configuration file defines the Samba “share” that you want to access, in this case the /var/www/html folder. If all works correctly, when you try to access the mapped drive over the network, the credentials you used to log into Windows will be passed to the Pi, first as a Samba user and then as a Linux user and you will be able to add and delete files from the shared folder.
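The share definition itself ends up only a few lines long; a sketch of the relevant smb.conf section, with an illustrative share name and user:

```
[www]
   comment = Apache web root
   path = /var/www/html
   valid users = youruser
   writeable = yes
   browseable = yes
```

After adding the matching Samba user with `sudo smbpasswd -a youruser` and restarting Samba, the share can be mapped as a network drive from Windows.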

Once all the setup is complete the keyboard, mouse and monitor can be disconnected. The Pi draws so little power that I just leave it running. With some work and minimal investment I now have a simple, reliable test server with a tiny physical and power consumption footprint.


Windows 10 Upgrade Experiences

Despite being a satisfied Windows 7 user, I bit and decided to take advantage of Microsoft’s offer of a free upgrade to Windows 10. Although I only upgraded two machines, my experiences may be useful to others contemplating the switch. Of course this does not apply to corporate fleets of PCs, where a re-imaging process will no doubt be used. The machines I upgraded were a work machine for my home-based consulting business and a machine used for home entertainment.

Machine #1: workstation

The first machine on my network to show up as “ready to upgrade” to Windows 10 was an HP Z400 Workstation with a 2.53 GHz Intel Xeon processor, 4 GB of RAM, running the Windows 7 Professional 64-bit operating system. This machine was purchased refurbished from a large, reputable retailer of computers, computer parts and accessories. It came with Windows 7 Professional 64-bit already installed and, aside from a slower, cheaper hard drive, the hardware was all original. It had been running flawlessly for just over a year in its role of everyday office computer for tasks such as email, website maintenance and application development.

The Windows 10 upgrade for this machine was not quite the one-click process described in Microsoft’s marketing copy, but was not all that difficult. Despite almost always leaving the computer on overnight to run automatic updates and virus scans, the Windows 10 upgrade process stopped partway through and informed me that the machine could not be upgraded at this time. After searching a few sources online, I followed a suggestion to run Windows updates manually and try again, which I did – a couple of times, eventually. It does, of course, make sense that all updates should be in place before proceeding, but the upgrade process did not identify this issue ahead of time; it simply stopped partway through.

Once it was finally underway for real, the upgrade process took about an hour with multiple restarts. I followed the on-screen suggestion to relax and do something else while the process ran. After the upgrade, all software that was installed before ran fine, although I had to immediately update Kaspersky Internet Security and restart yet again. The Settings screen shows the operating system as Windows 10 Pro.

Machine #2: entertainment system

The second machine to show up as “ready to upgrade” to Windows 10 was a small form factor HP Compaq dc5800 with a 2.33 GHz Intel Core Duo processor, 4 GB of RAM, running the Windows 7 Home Premium 32-bit operating system. This machine was also purchased refurbished from the same retailer as the workstation and with the operating system already installed on the original 80 GB HDD. To suit its role as an entertainment machine I had added more RAM, a mid-quality PCI-Express video card with HDMI output, a Hauppauge tuner card, a second 320 GB HDD for storing recorded video and a Logitech wireless touch keyboard. This machine stays permanently connected to our TV and although it is obviously not up to gaming, it functions perfectly well for watching streaming content and cable TV programming.

The Windows 10 upgrade process for this machine was far from straightforward, despite my attempt at learning from my first upgrade experience and running all updates manually first. After starting the upgrade process and watching it run for a few minutes while it downloaded the necessary files, the process stopped abruptly with the message “windows 10 cannot update system reserved partition”. Another try returned the same result. This sounded like a permissions problem to me at first, which puzzled me since I am an administrator on the machine. After searching online I learned it was not a permissions problem but a space problem. Sure enough, Disk Management showed that my system reserved partition was less than 100 MB. Not wanting (actually too lazy) to go the command-line route, I followed the method suggested by blog post author Buck Hodges and downloaded and installed MiniTool Partition Wizard Free. The tool’s interface is not the most intuitive and at times had me thinking about going the command-line route after all, but I eventually successfully enlarged my system reserved partition. Hodges enlarged his system reserved partition to 300 MB; since I wasn’t concerned about disk space on this machine, I enlarged mine to 650 MB just in case. (Post upgrade, the system reserved partition shows 266 MB of free space.)

Full of confidence now, I started the upgrade process again and it got past the previous error message. I walked away and came back an hour later expecting to find the process finished. Instead, it had stopped with a dialog box asking me to acknowledge that I understood that Windows Media Center was not going to be installed. I remembered reading about Windows 10 no longer including Windows Media Center, but I had assumed that since it was already installed it would simply carry over like my other software. It was not to be, but at this point I was not in the mood to turn back and accepted. (More about this later.) When the upgrade on this machine finished, the Settings screen showed the operating system as 32-bit Windows 10 Home on an x64-based system.

Post upgrade, Kaspersky Internet Security refused to update its database and had to be completely removed and reinstalled. Although I did not realize it at the time I made my choice, giving up Windows Media Center was a major loss. I thought that WinTV8, which I was able to obtain free from Hauppauge, would be a worthy replacement, but I was wrong. Sorry Hauppauge, but compared to Media Center it sucks. Especially missed from Media Center is the very handy scheduled recording tool built into the excellent channel guide. I had also forgotten about Media Center’s ability to create ad hoc playlists based on artist or genre from my networked MP3 collection.


Would I do it again? Of course I would. After making all the post-upgrade privacy tweaks necessary to prevent your computer from becoming a smartphone that shouts out your location and web-surfing habits to the world, Windows 10 is a leaner, faster Windows free of see-through dialog boxes and rounded corners on everything. It is what Windows 8 should have been. The upgrade process itself, at least for me, was very flawed. I was an Ubuntu user for many years and Ubuntu’s twice-a-year upgrade process was like a cleaning at the dentist. The Windows 10 upgrade was like a root canal. Hopefully Microsoft’s promised ongoing update process will be less painful.

Alan Cooper – Persona grata

It has been twenty years since I first read Alan Cooper’s book “About Face – The Essentials of User Interface Design”. At the time I was working as a product manager for a division of a Fortune 500 company. I had become interested in programming while working on a part number cross-reference tool for our order desk. I took Cooper’s book with me on a 10-day business trip. Reading it was instrumental in starting a career change for me from marketing to IT.

At the time he wrote “About Face”, Alan Cooper was best known as the “Father of Visual Basic”. For anyone who had learned programming using a command-line style C compiler, the VB IDE was a breath of fresh air and won Cooper awards. The VB IDE was an early example of Cooper’s obsession with making user interfaces more intuitive. In the case of VB, the users were programmers who were using VB to create user interfaces for others. Over the years Cooper has written more books on the same subject (interface design) including “The Inmates Are Running the Asylum: Why High-Tech Products Drive Us Crazy and How to Restore the Sanity” and three more editions of “About Face”.

In Cooper’s explanation of what “About Face” is about he says, in part, “To those who are intrigued by the technology – which includes most of us programmer types – we share a strong tendency to think in terms of functions and features. This is only natural, since this is how we build software: function by function. The problem is that this isn’t how users want to use software.”

Cooper now runs a San Francisco design firm specializing in “Interaction Design”. The normal development cycle for computer software goes something like this: gather requirements, design, code, deploy, train and support users, repeat. Cooper instead focuses on designing software systems around real-world usability from the very beginning, lessening the need for training, support and repeat cycles. While software design tools such as the Unified Modelling Language (UML) do use “actors” to represent users that interact with a software system, those “actors” can be very generic and can even simply represent other systems interacting with the software. Cooper and his design team have taken the interaction concept a step further by creating what they refer to as “personas” to represent not just generic human users of a system, but users with varying skill levels, different tasks to perform and different needs from the system. Designers are encouraged to give the personas names and personalities, which helps them identify with the personas and think about how each would use the software.

Critics have pointed out the similarities between Cooper’s methods and methods used by marketing. Nevertheless, using personas to design for a wide swath of possible users — such as may be the case with web applications — makes more sense than simply designing for a generic “user” and is simpler and less costly than employing focus groups or surveys.

Many thanks to Alan Cooper for helping me decide to make a career change those many years ago and for continuing to be a source of new ideas and inspiration.

Collaboration v. having collaboration software

Image courtesy of PinkBlue

The word collaboration is derived from the Latin word “collaboratus” meaning “to work together”. Collaboration is one of those words thrown around at job interviews along with “team” and “fit”. Most companies claim to have a collaborative culture, an environment where everyone’s input is valued. Few if any would admit to having a top-down or dictatorial culture. How an organization uses (or chooses not to use) collaboration software can tell you a great deal about the real culture there.

Fifteen years ago I was teaching programming and database theory at a small technical college. One day the boss dropped a CD on my desk and said he had a new project for me. The CD was an installation disk for Lotus Notes 4.5 and Domino Server. He wanted a Notes/Domino system to replace the smorgasbord of applications being used to handle the school’s administration. In a few months we had two Domino servers replicating, a dozen or so Notes databases in use — even a Notes form online for taking applications directly from the website (a big deal at the time). While learning about the Notes Access Control List, I hit upon the idea of creating a Notes database to which everyone working at the school had access, a place where one could post questions and ideas and others could reply with their own comments and ideas using the Notes threaded discussion model. I called it “The Water Cooler”. I still remember its old eight dot three file name: “watrcool.nsf”. When all was ready, I circulated an email about the new Notes database explaining its purpose and inviting people to start posting. Nothing happened. Puzzled, after a week or so I went around the office and personally showed people how to use it and even reminded everyone about it at a meeting. Still nothing. Miffed by now, I took a couple of my closer co-workers aside and asked them why they weren’t using the new database. The responses were all the same: management might take exception to an idea or comment and then there could be “trouble”. Not exactly a collaborative environment. I left a few months later and as far as I know, the collaboration tool was never used.

Fast forward ten years. The Internet and social media were now a part of everyday life for most people. I had just started work as a database analyst at an organization with about 160 networked employees. I was happy to learn that SharePoint, Microsoft’s off-the-shelf intranet tool, was in use. As I had spent a few years doing web development, I was eager to put my skills to use. I soon found out that one person, with very little web experience, administered SharePoint and controlled everything that went on it. SharePoint was mostly being used to push out Office documents and post a few links. When I joined a departmental group I noticed that some members had avatars but I could not create one for myself. I was told that the existing avatars had been created before management found out about and turned off the ability to create personal sites. Marketing had quietly gone about learning and using workflows, but other than that SharePoint was strictly top-down. The way it was deployed and used mirrored the corporate culture. Lots of potential, no collaboration. I no longer work there.

We are now on the cusp of 2015. The generation just entering the workforce has literally grown up with social media. Facebook has 1.3 billion active users. That is 18% of the entire human population, or nearly one in every five people on planet Earth. Twitter has 271 million active users. LinkedIn reports over 259 million users in over 200 countries. It is safe to say that not only is the Internet here to stay, but so is social networking. There are now rumours that a business version of Facebook is being developed. The Internet’s powerful decentralized model has rolled over the music industry, the publishing industry, the entertainment industry, the retail industry and the financial industry. The Internet model flattens hierarchies, viewing all connections as peers. Will your industry be next?

Deploying collaboration software at your workplace can transform the way you do business, if you let it. Or you can throttle it, restrict its use and dumb it down until it is largely ineffective. What is your collaboration software deployment going to say about your workplace?

Jeremy Rifkin: A Positive Futurist

Economics has a well-deserved reputation as “the dismal science”. Whether it is uncontrolled growth or worldwide depression, all futures envisioned by economists seem dystopian. The world is going to hell in a basket of goods.

Amongst all the doom and gloom Jeremy Rifkin represents a breath of fresh air. He is an economist with a positive view of the future. I happened upon an interview with Rifkin about his book “The Third Industrial Revolution: How Lateral Power is Transforming Energy, The Economy, and The World” and was mesmerized by his sweeping historical perspective. The interview is available on YouTube by following this link:

A lecturer at the Wharton School at the University of Pennsylvania, Rifkin has authored 20 books. He is the founder of The Foundation on Economic Trends and is an advisor to the EU.

A recurring theme in Rifkin’s recent books is the transformative power the Internet is exerting as a social and economic leveller and as a model for a new economy. In his latest book “The Zero Marginal Cost Society: The Internet of Things, The Collaborative Commons, and The Eclipse of Capitalism” Rifkin goes into detail about how the Internet has transformed commerce, the entertainment industry and the publishing industry and is beginning to transform the way we manufacture goods, the way we deliver services, the way we work and the way we produce and consume energy.

Jeremy Rifkin will make you look forward to the future.

Proactive IT

At an IT department meeting I attended, we were asked to describe what made a work order a “quality” work order. I guess the boss was hoping we would list all the wonderful ways that we would help someone solve a problem, or the process we would use to expedite things and meet service level agreements for response time and time to satisfactory resolution. My suggestion did not go over well. I suggested that the perfect work order was one that did not exist in the first place. I think this was misinterpreted to mean that I was lazy and hoped there would be no work orders. What I meant was that if the IT department was doing its job, there would be no problems to fix, no calls to the help desk, no work orders to fill. The needs of the business would have been anticipated and fulfilled seamlessly in the background before problems could arise.

Most IT departments began from a need to have people with the technical skills to set up and maintain the computers and networks used by a business. When the business grew to the point that a call for help over the cubicle wall was no longer practical, help desk systems developed, using phone calls, emails and specialized software to queue, track and document requests for technical assistance. As efficient as these systems can become, and as skilled and sincere as tech support can be, there is still a major problem. This type of system is reactive. Nothing gets done until someone requests help. If too many requests for the same type of help happen, then causes are investigated and repairs are made or training is delivered. What is needed is a proactive approach to IT.

In manufacturing, many might visualize quality control as testing and measuring items coming off the line and discarding those that are defective or sub-standard. This is very inefficient and costly. It is a reactive approach. What is required is a thorough understanding of the system that produces the items, and procedures in place that prevent the production of faulty items in the first place: a proactive approach. In IT, many might visualize quality control as tracking how quickly and effectively IT staff responds to requests for assistance. This is a reactive approach. What is required is a proactive approach: making requests for help unnecessary in the first place.

If the IT department is to free itself from the task of putting out fires and become an active participant in business it will have to analyze and understand the business systems that are in place and what their needs are and will be. If the IT department is to lose the geek stigma it carries, it will have to start participating in the strategic planning process instead of being just a supplier of services. The IT department has to become an asset instead of an expense. Unless this happens, IT departments will be progressively marginalized and eventually outsourced as business departments increasingly manage their own information needs.

Is your IT department obsolete?

I grew up in the 1950’s when network television was booming. My dad was an early adopter and always had to have the newest and latest version of everything, so we had a “television set” for as long as I can remember. In order to watch something, you had to become an expert at things like fine tuning, brightness and contrast, focusing, and adjusting the horizontal and vertical hold. If you were one of the elite with your own antenna tower and rotor, you learned to aim your directional antenna just so in order to catch that show being broadcast by a distant station. In those days, because the technology was relatively new, the hardware was at the centre of things. You turned on “the TV” to see what was available to watch on the broadcaster’s schedule. Today, after over half a century of evolution, one no longer watches “the TV”. Television as a technology is no longer about the hardware or even the delivery system. Television is about the content. People now simply watch “TV”.

Personal computers have gone through a similar growth process. Almost a quarter-century ago I bought my first PC — a 25 MHz 486 with 2 MB of RAM and a 250 MB hard drive for the princely sum of $2200 — and dutifully learned about autoexec.bat files and Lotus 1-2-3 macros. I learned about defragging and installing software from floppy disks (Lotus SmartSuite used 28 of them) and setting the screen resolution for that big new 15-inch monitor. Today most of those tasks are no longer necessary or are automated. Here at home we have six machines on our network that are largely specialized, including a couple of NAS devices, a dedicated machine for watching streaming video content and a Chromebook for checking the weather while breakfast is warming, or finding a recipe. We no longer “use the computer”; instead we look something up on “the Internet”. Like television before it, it is no longer about computer hardware but about the content.

Many IT departments began in the same era during which I bought my first PC. People I worked with back then often did not have a PC at home. Their only exposure to computing was at work. The support desk was a busy place. The IT department was responsible for keeping the computers running and connected. At some places I have worked, IT was even responsible for the phones and printers (they were networked, after all). Lately, the IT industry newsletters that I subscribe to online have been filled with headlines and quotes like the following:

• “Where did all the on-premises IT jobs go?” – InfoWorld newsletter
• “With minimal IT intervention, you can set up a self-service BI environment…” – TDWI
• “Software as a service vs. old-school IT” –

In spite of resistance and denial from “old-school” IT departments that still see themselves as the keepers of the keys, today’s knowledgeable and connected business users are doing an end-around and bypassing the IT department in their pursuit of useful and timely business information and applications. The help desk has been outsourced, the equipment is leased from and maintained by a third party, and increasingly even the data and the applications are stored and run in the remote data centres of contracted cloud providers. There is a surprising amount of denial and trash-talk about cloud computing on IT industry forums — mostly along the lines of cloud computing and big data being just the latest fad. Unfortunately, many IT departments that are stuck in the past have become “computer departments” or “network departments” or “data departments” instead of “Information Technology departments”. They have lost sight of the fact that it is no longer about the hardware and software but about the content.

Take an honest look at your own IT department. What does it do right now that cannot be outsourced? How easy is it to obtain the information you need to do your job? Do the members of your IT department have the skills necessary to support your organizational information needs and if so, are they allowed to use those skills?