Raspberry Pi Web Server

If you do any serious web development using PHP and MySQL, you will eventually realize the need for a test server on your local network. For years I used a re-purposed PC running Ubuntu Server. I would code the pages using Visual Studio running on my Windows workstation and then upload them to my test server. I used MySQL Workbench to connect to the MySQL database running on the server. The old PC finally died and I was casting around for alternatives when I happened upon the chapter “The Pi as a Web Server” in the Raspberry Pi User Guide by Eben Upton and Gareth Halfacree.

If you are not familiar with the Raspberry Pi, it is, in brief, a tiny computer on a single circuit board that was developed in the UK as an educational tool to make learning about computers and programming affordable for schools, hobbyists and the “Maker” community.

Getting the hardware

Raspberry Pis are readily available on Amazon and eBay. I bought a Raspberry Pi 2 Model B packaged with a plastic case and a power supply for around $50 Canadian. This particular model has a quad-core 900 MHz processor, 1 GB of RAM, a full HDMI port, an Ethernet port and 4 USB ports. The other connectors on the board will not be used for this application. The power supply connects via a micro-USB plug, much like a smartphone’s.

The Pi uses a micro-SD card to store the operating system, software and files. A minimum capacity of 4 GB is recommended. I purchased a blank 32 GB micro-SD card along with an SD adapter to connect to my PC.

To get up and running you will need a USB keyboard and mouse and an HDMI cable to connect to an HDMI port on a TV, or an HDMI to DVI adapter cable to connect to a monitor. VGA is not supported; HDMI to VGA adapters are available but quite expensive. I bought an HDMI to DVI cable online for under $10. You will also need a network cable to connect to your router.

Installing the software

Some Raspberry Pis are sold bundled with the NOOBS installer utility preloaded on a micro-SD card. If yours did not come with one, you will need to put the operating system on the card yourself. The card can be flashed directly with an operating system image, but the easiest method is to use the NOOBS installer utility, a free download. If you have a blank card you will have to format it first using SD Formatter, also a free download. Once your card is ready, unzip the NOOBS utility and copy its contents onto it.
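
For anyone preparing the card from a Linux machine instead of Windows, the same steps can be done from a terminal. A minimal sketch, assuming the card is already FAT-formatted and that the NOOBS zip has been downloaded (the device name /dev/sdX1 and the file name below are placeholders for your own):

    lsblk                                  # confirm which device is the SD card before touching anything
    sudo mount /dev/sdX1 /mnt              # mount the card's FAT partition
    sudo unzip NOOBS_v1_9.zip -d /mnt      # copy the unzipped NOOBS files onto the card
    sudo umount /mnt                       # flush writes and unmount before removing the card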

Insert your prepared micro-SD card into the port on the board. Connect your USB keyboard and mouse. Connect your TV or monitor to the HDMI port. Connect your Pi to your router with the network cable. During setup, the Pi will connect to the Internet to download updates. Later on you will need to set a static IP address for the Pi in its role as a web server, but for now it is best to make sure that DHCP is enabled on your router so the Pi can grab an IP address for immediate use. With all connections made, plug in the power supply and the Pi will start up.

The NOOBS setup utility will display on your screen, offering a choice of operating systems to install. For use as a web test server, Raspbian is best. This is a version of Debian Linux modified for the Raspberry Pi. The Pi will restart to the Raspbian desktop once installation of the operating system is complete. A GUI consumes resources unnecessarily on a test server, so it is best to change the settings so that the Pi boots to a command line; the rest of the setup is done from there. Raspbian’s command line uses the apt-get package manager, which will be familiar if you are a Debian or Ubuntu user. If you need a tutorial on using the Linux command line and apt-get, there are plenty available online.
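
As a rough sketch, switching to a console boot and pulling in updates looks something like this (the raspi-config menu wording varies a little between releases):

    sudo raspi-config        # under the boot options, choose to boot to the console rather than the desktop
    sudo apt-get update      # refresh the package lists
    sudo apt-get upgrade     # install any available updates
    sudo reboot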

Use apt-get to install the Apache web server, MySQL and PHP; the packages will be downloaded automatically. Make sure to record the username and password you set for MySQL. You will need them later to connect to MySQL remotely via a GUI tool running on your workstation; there are plenty of these tools available as free downloads. At this point, after using ifconfig to find the temporary IP address assigned to the Pi by DHCP, you should be able to view the Apache and PHP confirmation pages from a browser over the network, indicating that your web server is up and running.
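
The installs themselves are a handful of apt-get commands. The package names below are the ones from the Raspbian release current at the time (newer releases use php7 and MariaDB instead), so treat this as a sketch rather than a recipe:

    sudo apt-get install apache2                                # the Apache web server
    sudo apt-get install mysql-server                           # MySQL; you will be asked to set a root password
    sudo apt-get install php5 libapache2-mod-php5 php5-mysql    # PHP plus the Apache and MySQL modules
    sudo service apache2 restart                                # reload Apache so it picks up PHP
    ifconfig                                                    # note the DHCP-assigned address for the browser test

Browsing to the Pi’s address should bring up Apache’s default page, and a one-line file in /var/www/html containing <?php phpinfo(); ?> is an easy way to confirm that PHP is working.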

Final steps

Your web server needs a host name and a fixed IP address. You will also have to set the time zone and locale. These tasks can be done from the command line using the built-in raspi-config utility and the nano text editor. I turned off SSH as a security measure, since I did not plan to run commands remotely on the Pi.
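
For the fixed address, on the Raspbian Jessie image the setting goes in /etc/dhcpcd.conf (older images used /etc/network/interfaces instead). A minimal sketch, assuming a typical 192.168.1.x home network; substitute addresses that suit your own router:

    # added to /etc/dhcpcd.conf with sudo nano, then reboot
    # (192.168.1.50 is the address the Pi will keep; 192.168.1.1 is the router and DNS)
    interface eth0
    static ip_address=192.168.1.50/24
    static routers=192.168.1.1
    static domain_name_servers=192.168.1.1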

The most time-consuming step for me was setting up a way to see files on the Pi over the network from my Windows workstation. I used apt-get to install Samba, which allows the creation of a “share” – a folder on the Pi that can be mapped as a network drive on your workstation. There are a few steps here, and I went through a lot of trial and quite a bit of error. I created a Linux user matching my Windows username and password, then created a Samba user with the same name and password. I gave the Linux user ownership of the /var/www/html folder, where the Apache server keeps the web files. An smb.conf file was created with the nano text editor; this configuration file defines the Samba “share” you want to access – in this case the /var/www/html folder. If all works correctly, when you access the mapped drive over the network, the credentials you used to log into Windows are passed to the Pi, first as a Samba user and then as a Linux user, and you can add and delete files in the shared folder.
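
The Samba steps boil down to something like the following; “alan” is a stand-in for whatever your Windows user name happens to be, and the share name is arbitrary:

    sudo apt-get install samba samba-common-bin
    sudo adduser alan                    # Linux user matching the Windows account
    sudo smbpasswd -a alan               # Samba user with the same name and password
    sudo chown -R alan /var/www/html     # give that user ownership of the web root

with a share definition along these lines added to /etc/samba/smb.conf using nano:

    [www]
       path = /var/www/html
       valid users = alan
       read only = no

followed by “sudo service smbd restart” so the new share shows up when you map the drive from Windows.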

Once all the setup is complete, the keyboard, mouse and monitor can be disconnected. The Pi draws so little power that I just leave it running. With some work and minimal investment I now have a simple, reliable test server with a tiny physical footprint and minimal power consumption.

References:

http://linuxcommand.org/index.php
https://help.ubuntu.com/lts/serverguide/apt-get.html
http://projpi.com/diy-home-projects-with-a-raspberry-pi/pi-web-server/
http://pimylifeup.com/raspberry-pi-web-server/
https://www.samba.org/samba/docs/using_samba/ch06.html
https://pi-hole.net/faq/how-do-i-set-a-static-ip-address-in-raspbian-jessie-using-etcdhcpcd-conf/

Windows 10 Upgrade Experiences

Although I had been a satisfied Windows 7 user for a while, I finally bit and decided to take advantage of Microsoft’s offer of a free upgrade to Windows 10. I only upgraded two machines, but my experiences may be useful to others contemplating the switch. Of course, this does not apply to corporate fleets of PCs, where a re-imaging process will no doubt be used. The machines I upgraded were a work machine for my home-based consulting business and a machine used for home entertainment.

Machine #1: workstation

The first machine on my network to show up as “ready to upgrade” to Windows 10 was an HP Z400 Workstation with a 2.53 GHz Intel Xeon processor and 4 GB of RAM, running the Windows 7 Professional 64-bit operating system. This machine was purchased refurbished from a large, reputable retailer of computers, computer parts and accessories. It came with Windows 7 Professional 64-bit already installed and, aside from a slower, cheaper hard drive, the hardware was all original. It had been running flawlessly for just over a year in its role as everyday office computer for tasks such as email, website maintenance and application development.

The Windows 10 upgrade for this machine was not quite the one-click process described in Microsoft’s marketing copy, but it was not all that difficult. Despite almost always leaving the computer on overnight to run automatic updates and virus scans, the Windows 10 upgrade process stopped partway through and informed me that the machine could not be upgraded at this time. After searching a few sources online, I followed a suggestion to run Windows updates manually and try again, which I did – a couple of times, eventually. It does, of course, make sense that all updates should be in place before proceeding, but the upgrade process did not identify this requirement ahead of time; it simply stopped partway through.

Once it was finally underway for real, the upgrade process took about an hour with multiple restarts. I followed the on-screen suggestion to relax and do something else while the process ran. After the upgrade, all previously installed software ran fine, although I had to immediately update Kaspersky Internet Security and restart yet again. The Settings screen shows the operating system as Windows 10 Pro.

Machine #2: entertainment system

The second machine to show up as “ready to upgrade” to Windows 10 was a small form factor HP Compaq dc5800 with a 2.33 GHz Intel Core Duo processor and 4 GB of RAM, running the Windows 7 Home Premium 32-bit operating system. This machine was also purchased refurbished from the same retailer as the workstation, with the operating system already installed on the original 80 GB HDD. To suit its role as an entertainment machine I had added more RAM, a mid-range PCI Express video card with HDMI output, a Hauppauge tuner card, a second 320 GB HDD for storing recorded video and a Logitech wireless touch keyboard. This machine stays permanently connected to our TV and, although it is obviously not up to gaming, it functions perfectly well for watching streaming content and cable TV programming.

The Windows 10 upgrade process for this machine was far from straightforward, despite my attempt at learning from my first upgrade experience and running all updates manually first. After starting the upgrade process and watching it run for a few minutes while it downloaded the necessary files, the process stopped abruptly with the message “windows 10 cannot update system reserved partition”. Another try returned the same result. This sounded like a permissions problem to me at first, which puzzled me since I am an administrator on the machine. After searching online I learned it was not a permissions problem but a space problem. Sure enough, Disk Management showed that my system reserved partition was less than 100 MB. Not wanting (actually too lazy) to go the command-line route, I followed the directions in a blog post by Buck Hodges and downloaded and installed MiniTool Partition Wizard Free. The tool’s interface is not the most intuitive and at times had me thinking about going the command-line route after all, but I eventually enlarged my system reserved partition successfully. Hodges enlarged his system reserved partition to 300 MB. Since I wasn’t concerned about disk space on this machine, I enlarged mine to 650 MB just in case. (Post upgrade, the system reserved partition shows 266 MB of free space.)

Full of confidence now, I started the upgrade process again and it got past the previous error message. I walked away and came back an hour later expecting to find the process finished. Instead, it had stopped with a dialog box asking me to acknowledge that I understood that Windows Media Center was not going to be installed. I remembered reading about Windows 10 no longer including Windows Media Center, but I had assumed that since it was already installed it would simply carry over like my other software. It was not to be, but at this point I was not in the mood to turn back and accepted. (More about this later.) When the upgrade on this machine finished, the Settings screen showed the operating system as 32-bit Windows 10 Home on an x64-based system.

Post upgrade, Kaspersky Internet Security refused to update its database and had to be completely removed and reinstalled. Although I did not realize it at the time I made my choice, giving up Windows Media Center was a major loss. I thought that WinTV8, which I was able to obtain free from Hauppauge, would be a worthy replacement, but I was wrong. Sorry Hauppauge, but compared to Media Center it sucks. What I will especially miss from Media Center is the very handy scheduled recording tool built into its excellent channel guide. I had also forgotten about Media Center’s ability to create ad hoc playlists based on artist or genre from my networked MP3 collection.

Conclusions

Would I do it again? Of course I would. After making all the post-upgrade privacy tweaks necessary to prevent your computer from becoming a smartphone that shouts out your location and web-surfing habits to the world, Windows 10 is a leaner, faster Windows, free of see-through dialog boxes and rounded corners on everything. It is what Windows 8 should have been. The upgrade process itself, at least for me, was very flawed. I was an Ubuntu user for many years, and Ubuntu’s twice-a-year upgrade process was like a cleaning at the dentist. The Windows 10 upgrade was like a root canal. Hopefully Microsoft’s promised ongoing update process will be less painful.

Alan Cooper – Persona grata

It has been twenty years since I first read Alan Cooper’s book “About Face – The Essentials of User Interface Design”. At the time I was working as a product manager for a division of a Fortune 500 company. I had become interested in programming while working on a part number cross-reference tool for our order desk. I took Cooper’s book with me on a 10-day business trip. Reading it was instrumental in starting a career change for me from marketing to IT.

At the time he wrote “About Face”, Alan Cooper was best known as the “Father of Visual Basic”. For anyone who had learned programming using a command-line style C compiler, the VB IDE was a breath of fresh air and won Cooper awards. The VB IDE was an early example of Cooper’s obsession with making user interfaces more intuitive. In the case of VB, the users were programmers who were using VB to create user interfaces for others. Over the years Cooper has written more books on the same subject (interface design) including “The Inmates Are Running the Asylum: Why High-Tech Products Drive Us Crazy and How to Restore the Sanity” and three more editions of “About Face”.

In Cooper’s explanation of what “About Face” is about he says, in part, “To those who are intrigued by the technology – which includes most of us programmer types – we share a strong tendency to think in terms of functions and features. This is only natural, since this is how we build software: function by function. The problem is that this isn’t how users want to use software.”

Cooper now runs a San Francisco design firm specializing in “Interaction Design”. The normal development cycle for computer software goes something like this: gather requirements, design, code, deploy, train and support users, repeat. Cooper instead focuses on designing software systems around real-world usability from the very beginning, lessening the need for training, support and repeat cycles. While software design tools such as the Unified Modelling Language (UML) do use “actors” to represent users who interact with a software system, those “actors” can be very generic and can even simply represent other systems interacting with the software. Cooper and his design team have taken the “interaction” concept a step further by creating what they refer to as “personas” to represent not just human users of a system, but human users with varying skill levels, different tasks to perform and different needs from the system. Designers are encouraged to give the personas names and personalities, which helps the designers identify with them and think about how each persona would use the software.

Critics have pointed out the similarities between Cooper’s methods and methods used by marketing. Nevertheless, using personas to design for a wide swath of possible users — such as may be the case with web applications — makes more sense than simply designing for a generic “user” and is simpler and less costly than employing focus groups or surveys.

Many thanks to Alan Cooper for helping me decide to make a career change those many years ago and for continuing to be a source of new ideas and inspiration.

Collaboration v. having collaboration software

Image courtesy of PinkBlue at FreeDigitalPhotos.net

The word collaboration is derived from the Latin word “collaboratus” meaning “to work together”. Collaboration is one of those words thrown around at job interviews along with “team” and “fit”. Most companies claim to have a collaborative culture, an environment where everyone’s input is valued. Few if any would admit to having a top-down or dictatorial culture. How an organization uses (or chooses not to use) collaboration software can tell you a great deal about the real culture there.

Fifteen years ago I was teaching programming and database theory at a small technical college. One day the boss dropped a CD on my desk and said he had a new project for me. The CD was an installation disk for Lotus Notes 4.5 and Domino Server. He wanted a Notes/Domino system to replace the smorgasbord of applications being used to handle the school’s administration. In a few months we had two Domino servers replicating, a dozen or so Notes databases in use — even a Notes form online for taking applications directly from the website (a big deal at the time). While learning about the Notes Access Control List, I hit upon the idea of creating a Notes database to which everyone working at the school had access, a place where anyone could post questions and ideas and others could reply with comments and ideas of their own using the Notes threaded discussion model. I called it “The Water Cooler”. I still remember its old eight-dot-three file name: “watrcool.nsf”. When all was ready, I circulated an email about the new Notes database explaining its purpose and inviting people to start posting. Nothing happened. Puzzled, after a week or so I went around the office and personally showed people how to use it and even reminded everyone about it at a meeting. Still nothing. Miffed by now, I took a couple of my closer co-workers aside and asked them why they weren’t using the new database. The responses were all the same: management might take exception to an idea or comment and then there could be “trouble”. Not exactly a collaborative environment. I left a few months later and as far as I know, the collaboration tool was never used.

Fast forward ten years. The Internet and social media were now a part of everyday life for most people. I had just started work as a database analyst at an organization with about 160 networked employees. I was happy to learn that SharePoint, Microsoft’s off-the-shelf intranet tool, was in use. As I had spent a few years doing web development, I was eager to put my skills to use. I soon found out that one person, with very little web experience, administered SharePoint and controlled everything that went on it. SharePoint was mostly being used to push out Office documents and post a few links. When I joined a departmental group I noticed that some members had avatars but I could not create one for myself. I was told that the existing avatars had been created before management found out about and turned off the ability to create personal sites. Marketing had quietly gone about learning and using workflows, but other than that SharePoint was strictly top-down. The way it was deployed and used mirrored the corporate culture. Lots of potential, no collaboration. I no longer work there.

We are now on the cusp of 2015. The generation just entering the workforce has literally grown up with social media. Facebook has 1.3 billion active users. That is roughly 18% of the entire human population, or nearly one in every five people on planet Earth. Twitter has 271 million active users. LinkedIn reports over 259 million users in over 200 countries. It is safe to say that not only is the Internet here to stay, but so is social networking. There are now rumours that a business version of Facebook is being developed. The Internet’s powerful decentralized model has rolled over the music industry, the publishing industry, the entertainment industry, the retail industry and the financial industry. The Internet model flattens hierarchies, viewing all connections as peers. Will your industry be next?

Deploying collaboration software at your workplace can transform the way you do business — if you let it — or you can throttle it, restrict its use and dumb it down until it is largely ineffective. What is your collaboration software deployment going to say about your workplace?

Jeremy Rifkin: A Positive Futurist

Economics has a well-deserved reputation as “the dismal science”. Whether it is uncontrolled growth or worldwide depression, all futures envisioned by economists seem dystopian. The world is going to hell in a basket of goods.

Amongst all the doom and gloom, Jeremy Rifkin represents a breath of fresh air. He is an economist with a positive view of the future. I happened upon an interview with Rifkin about his book “The Third Industrial Revolution: How Lateral Power is Transforming Energy, The Economy, and The World” and was mesmerized by his sweeping historical perspective. The interview is available on YouTube.

A lecturer at the Wharton School at the University of Pennsylvania, Rifkin has authored 20 books. He is the founder of The Foundation on Economic Trends (www.foet.org) and is an advisor to the EU.

A recurring theme in Rifkin’s recent books is the transformative power the Internet is exerting as a social and economic leveller and as a model for a new economy. In his latest book “The Zero Marginal Cost Society: The Internet of Things, The Collaborative Commons, and The Eclipse of Capitalism” Rifkin goes into detail about how the Internet has transformed commerce, the entertainment industry and the publishing industry and is beginning to transform the way we manufacture goods, the way we deliver services, the way we work and the way we produce and consume energy.

Jeremy Rifkin will make you look forward to the future.

Proactive IT

At an IT department meeting I attended, we were asked to describe what made a work order a “quality” work order. I guess the boss was hoping we would list all the wonderful ways that we would help someone solve a problem, or the process we would use to expedite things and meet service level agreements for response time and time to satisfactory resolution. My suggestion did not go over well. I suggested that the perfect work order was one that did not exist in the first place. I think this was misinterpreted to mean that I was lazy and hoped there would be no work orders. What I meant was that if the IT department was doing its job, there would be no problems to fix, no calls to the help desk, no work orders to fill. The needs of the business would have been anticipated and fulfilled seamlessly in the background before problems could arise.

Most IT departments began from a need to have people with the technical skills to set up and maintain the computers and networks used by a business. When the business grew to the point that a call for help over the cubicle wall was no longer practical, help desk systems developed, using phone calls, emails and specialized software to queue, track and document requests for technical assistance. As efficient as these systems can become, and as skilled and sincere as tech support can be, there is still a major problem. This type of system is reactive. Nothing gets done until someone requests help. If too many requests for the same type of help come in, causes are investigated and repairs are made or training is delivered. What is needed is a proactive approach to IT.

In manufacturing, many might visualize quality control as testing and measuring items coming off the line and discarding those that are defective or sub-standard. This is very inefficient and costly. It is a reactive approach. What is required is a thorough understanding of the system that produces the items, and procedures in place that prevent the production of faulty items in the first place: a proactive approach. In IT, many might visualize quality control as tracking how quickly and effectively IT staff responds to requests for assistance. This is a reactive approach. What is required is a proactive approach: making requests for help unnecessary in the first place.

If the IT department is to free itself from the task of putting out fires and become an active participant in the business, it will have to analyze and understand the business systems that are in place and what their needs are and will be. If the IT department is to lose the geek stigma it carries, it will have to start participating in the strategic planning process instead of being just a supplier of services. The IT department has to become an asset instead of an expense. Unless this happens, IT departments will be progressively marginalized and eventually outsourced as business departments increasingly manage their own information needs.

Is your IT department obsolete?

I grew up in the 1950s when network television was booming. My dad was an early adopter and always had to have the newest and latest version of everything, so we had a “television set” for as long as I can remember. In order to watch something, you had to become expert at things like fine tuning, brightness and contrast, focusing, and adjusting the horizontal and vertical hold. If you were one of the elite with your own antenna tower and rotor, you learned to aim your directional antenna just so in order to catch that show being broadcast by a distant station. In those days, because the technology was relatively new, the hardware was at the centre of things. You turned on “the TV” to see what was available to watch on the broadcaster’s schedule. Today, after over half a century of evolution, one no longer watches “the TV”. Television as a technology is no longer about the hardware or even the delivery system. Television is about the content. People now simply watch “TV”.

Personal computers have gone through a similar growth process. Almost a quarter-century ago I bought my first PC — a 25 MHz 486 with 2 MB of RAM and a 250 MB hard drive for the princely sum of $2200 — and dutifully learned about autoexec.bat files and Lotus 123 macros. I learned about defragging and installing software from floppy disks (Lotus SmartSuite used 28 of them) and setting the screen resolution for that big new 15 inch monitor. Today most of those tasks are no longer necessary or are automated. Here at home we have six machines on our network that are largely specialized, including a couple of NAS devices, a dedicated machine for watching streaming video content and a Chromebook for checking the weather while breakfast is warming, or finding a recipe. We no longer “use the computer”; instead we look something up on “the Internet”. Like television before it, computing is no longer about the hardware but about the content.

Many IT departments began in the same era during which I bought my first PC. People I worked with back then often did not have a PC at home. Their only exposure to computing was at work. The support desk was a busy place. The IT department was responsible for keeping the computers running and connected. Some places I have worked, IT was even responsible for the phones and printers (they were networked after all). Lately, the IT industry newsletters that I subscribe to online have been filled with headlines and quotes like the following:

• “Where did all the on-premises IT jobs go?” – InfoWorld newsletter
• “With minimal IT intervention, you can set up a self-service BI environment…” – TDWI
• “Software as a service vs. old-school IT” – InfoWorld.com

In spite of resistance and denial from “old-school” IT departments that still see themselves as the keepers of the keys, today’s knowledgeable and connected business users are doing an end-around and bypassing the IT department in their pursuit of useful and timely business information and applications. The help desk has been outsourced, the equipment is leased from and maintained by a third party, and increasingly even the data and the applications are stored and run in the remote data centres of contracted cloud providers. There is a surprising amount of denial and trash-talk about cloud computing on IT industry forums — mostly along the lines of cloud computing and big data being just the latest fad. Unfortunately many IT departments that are stuck in the past have become “computer departments” or “network departments” or “data departments” instead of “Information Technology departments”. They have lost sight of the fact that it is no longer about the hardware and software but about the content.

Take an honest look at your own IT department. What does it do right now that cannot be outsourced? How easy is it to obtain the information you need to do your job? Do the members of your IT department have the skills necessary to support your organizational information needs and if so, are they allowed to use those skills?

Toward Business in Real-Time

In 2002 I attended the PeopleSoft on Tour conference in Toronto. This was before the merger with JD Edwards and the subsequent acquisition by Oracle. The software industry was still trying to recover from the recent bursting of the dot-com bubble. I had gone to the conference with the naive hope of learning more about PeopleSoft’s products and possibly landing a job working with them.

Dashed hopes notwithstanding, I was inspired by the keynote address presented by the then-CEO of PeopleSoft, Craig Conway. The theme of his address was “The Real-Time Enterprise”, and it still reverberates today. Conway recapped the history of business reporting.

Annual reports came first, prepared for the traditional annual meeting and often synched with, and necessitated by, accounting’s annual tax reporting cycle. In fact, most of the content for the annual reports came straight from the accounting department. Annual reports for an enterprise were, and still are, also used for shareholder and board meetings. They are nice when the previous year’s news is good, but when the news is bad they are not much help in correcting problems before they get worse. It’s hard to imagine today, but at one time these reports had to be compiled manually.

As computerization made reporting faster and easier, reports for the quarterly board meeting became the norm. It soon followed logically that monthly reports would be even more useful. Monthly reports allowed managers to make mid-stream adjustments, informed by the quarterly board meetings, and helped avoid having to report something bad at the end of the year. Reporting had moved from the boardroom to the manager’s office.

Reporting cycle times have hovered around 24 hours or less for decades now. Most companies have data up to midnight of the previous day available for reporting and planning purposes. Conway’s vision of a “Real-Time Enterprise” involved reducing reporting cycle times to zero, showing the actual current state of the company.

Many companies — even those not using PeopleSoft/Oracle products — now have “dashboards” that display KPIs on one convenient screen, mimicking manufacturing process control systems, although unlike those control systems, most dashboards are not yet truly real-time. Users now have information at their fingertips that allows them to steer companies with more control than ever before. In just over ten years’ time, the vision of the real-time enterprise has almost become reality.

US Government Surveillance – A Historical Perspective

In 1925, during Prohibition in the US, former police officer Roy Olmstead was convicted of running a huge bootlegging operation in the Seattle, Washington area. The conviction was largely based on transcripts of wire-taps on Olmstead’s telephone, a relatively new evidence-gathering technique. Olmstead appealed his conviction all the way to the Supreme Court, claiming that the wire-taps were in violation of his rights under the Fourth Amendment to the US Constitution, which protects US citizens against unreasonable search and seizure.

In 1928, the Supreme Court upheld Olmstead’s conviction, but Justice Louis D. Brandeis gave this prophetic dissenting opinion:

“The progress of science in furnishing the Government with means of espionage is not likely to stop with wire-tapping. Ways may someday be developed by which the Government, without removing papers from secret drawers, can reproduce them in court, and by which it will be enabled to expose to a jury the most intimate occurrences of the home.

…if the government becomes a lawbreaker, it breeds contempt for law; it invites every man to become a law unto himself; it invites anarchy. To declare that in the administration of the criminal law the end justifies the means—to declare that the government may commit crimes in order to secure the conviction of a private criminal—would bring terrible retribution. Against that pernicious doctrine this court should resolutely set its face.”

It was not until 39 years later that the US Supreme Court changed its view on privacy. In 1967 the Supreme Court overturned the conviction of Charles Katz, who had used a telephone booth to relay bets as part of a gambling operation. The telephone booth had been bugged by the FBI using an external microphone.

Justice Stewart explained the ruling as follows:

“The Government’s activities in electronically listening to and recording the petitioner’s words violated the privacy upon which he justifiably relied while using the telephone booth and thus constituted a ‘search and seizure’ within the meaning of the Fourth Amendment.”

A year later the US Congress passed the Omnibus Crime Control and Safe Streets Act which authorized electronic eavesdropping and wire-tapping as long as a warrant was obtained.

Another 46 years down the road and things are again complicated by further technological advances:

  • Can someone talking on a cellphone in a public place justifiably expect privacy?
  • Does gathering information for later analysis constitute search and seizure?
  • Is tracking someone whose GPS-enabled smartphone is broadcasting their location illegal?

Watch this educational video on the website of the American Civil Liberties Union for a primer on what can already be done using modern technology:

https://www.aclu.org/technology-and-liberty/meet-jack-or-what-government-could-do-all-location-data

BI minus IT

As I read Business Intelligence (BI) newsletters, company websites and blogs, a common theme emerges: people are sick of waiting for IT to deliver what they need, because if and when IT finally does deliver, it is usually way too late.

This is not new. Many years ago, while working as a product manager, I watched order desk staff struggling with catalogs the size of phone books (both thankfully now obsolete) and took some night school programming courses so I could create a computerized cross-reference of competitors’ part numbers. Thinking I would be congratulated, I was instead reprimanded. (The order desk staff thanked me.)

Flash forward a couple of decades and not much has changed. One place I worked, users knew more about SharePoint than the person who was supposed to be administering it and were quietly using features like workflows that IT considered “dangerous” (translation: features IT did not understand and did not feel they had to learn or support).

Movements like BYOD, personal clouds and self-service BI are user rebellion against the old “if it ain’t broke, don’t fix it” mentality of most IT departments. When all arguments are exhausted, IT usually falls back on the “security” catch-all objection.

Most modern business analytics tools stress the same feature: getting it done quickly without IT involvement. At this rate, most IT departments will be reduced to simply “keeping the lights on” and will become increasingly irrelevant when business strategies are being formulated and executed.