Windows 8.1 forces the user to interact in a way that doesn’t work

Problem

Wacom is world famous for digital pens, and its professional Intuos series is used by millions of artists around the world. It looks something like this and connects to the computer through a USB port:

Wacom Intuos 3

After connecting a Wacom Intuos 3 pen, Windows took a couple of minutes to download and install some drivers. But absolute positioning, one of the main features of digital pens, didn’t work, so I headed to Wacom’s support site and installed the latest drivers for Windows 8. So far so good. Now Windows 8.1 behaves as if it’s a tablet PC with a touch interface. This is smart. It even shows this little keyboard on the taskbar that allows me to type with the pen:

Windows 8.1 touch keyboard

However, Windows 8.1 apparently fails to detect touch gestures with this pen (I can live with that). The problem is that Windows then tries to teach me touch gestures, and the only way to get rid of this “compulsory education” is to perform a gesture that is impossible with this device! Windows shows this black popover at the left side of the screen asking the user to swipe from the edge:

Windows 8.1 teaches the user to swipe the edge to switch between applications

It reads: “Swipe in from the edge to go back to the last app you were using. Tip: begin with your finger outside of your screen”. So I tried swiping left, right, top and bottom, clicking and double-clicking, with both the pen and the mouse. Nothing helped. I guess the key word is “finger”. This is not OK. Even a user who forgave the misdetected device will regret it after this annoying lesson. As I’m writing this, the popover is still there and there’s no way to get rid of it. So after finishing this post I’m going to try the age-old Windows recipe: restart.

Solution

P.S. This section is about interaction design solutions, but if you are a user facing this issue, the quickest fix is to move your mouse cursor to the top left corner of the screen.

  1. Don’t force the user into any sort of education. Sometimes they can’t do it. Sometimes they have more important things to do and want to skip it. Sometimes they already know how it works. There are tons of reasons why people don’t want to be forced into learning something, so don’t force them if you want them to like your products.
  2. Do more usability testing with popular devices like Wacom’s. These devices are used professionally all around the world, and when things don’t work smoothly, companies lose money. I would expect Windows to download the original driver or provide equally good functionality. The new Windows behaves the way Linux did a few years ago: many devices didn’t have good support in Linux, and when they did, often some of their features didn’t work. I haven’t used Linux lately, but Mac OS X installed my Wacom automatically upon connecting it and it worked like a charm. Windows should catch up if it wants to reclaim its dominant position in the market.
  3. As one dear reader mentioned, it is possible to disable these “mandatory educations” altogether, though that takes a little extra searching. Here is a good explanation of how to turn them off (note: Group Policy only exists in Windows 8 Pro; a registry sketch follows this list). As of writing this post, that question has been viewed 3704 times, and this post has been visited 2541 times in less than 3 hours, so I assume this is quite a hot topic. I wonder if the Windows designers will fix it. I’ll contact Microsoft and share their insight on this page. Meanwhile you can head over to Hacker News and see what other people are saying about this issue.
  4. Add a close button to let the user dismiss this modal, persistent, sticky popover:
Windows 8.1 educational popover with close button
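
For the curious, that Group Policy setting boils down to a single registry value. Here is a minimal sketch in Python; the EdgeUI key and the DisableHelpSticker value are the ones commonly reported for the “Disable help tips” policy on Windows 8.1, so treat them as an assumption to verify, and run this as administrator:

    # Sketch: turn off the Windows 8.1 edge-gesture "help tips" via the
    # registry (requires administrator rights; the key and value names are
    # the commonly reported ones behind the Group Policy setting).
    import winreg

    key = winreg.CreateKey(winreg.HKEY_LOCAL_MACHINE,
                           r"SOFTWARE\Policies\Microsoft\Windows\EdgeUI")
    winreg.SetValueEx(key, "DisableHelpSticker", 0, winreg.REG_DWORD, 1)
    winreg.CloseKey(key)

Signing out and back in (or the good old restart) should make it take effect.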

Windows 8.1 doesn’t run apps on the corresponding screen

Problem

Windows 8.1, just like its predecessors, has good support for multiple screens. With the return of the start button, you can “pin” apps to your taskbar. This creates an icon that lets you quickly run an app. But even though the taskbar is duplicated exactly on every screen, clicking an app icon on one screen doesn’t guarantee that the app will open on that same screen.

This behavior can be quite annoying, especially if you’re using a projector. Here is a little image to illustrate the issue:

Trying to run an app in the left screen may show the app in another screen

Mac OS X has the same issue: when you run an app, you really can’t predict where it will show up. It probably shows up where it ran last time, but that introduces an element of surprise, since the user expects the action (the click) and the effect (the app) to appear close to each other.

Solution

There are a few solutions to this issue.

  1. The easiest one is to run the app on the screen where the user clicked the app button (a code sketch follows this list).
  2. When the user clicks the button, use an animation to guide their eye to the screen that contains the app. Mac OS X uses this technique, though the first solution is still more predictable.
  3. If the designers can’t agree or confirm the design decision with usability tests, add a setting in the control panel that lets the user choose whether new apps open where they last showed up or close to where the run action was initiated.
  4. Allow the user to run the app on the current screen with some sort of right-click menu on the taskbar. Something like this:

Mockup: a right-click menu on the taskbar for running the app on the current screen
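
To show how little solution 1 would take, here is a rough sketch of the Win32 calls involved, written in Python with the standard ctypes module. It is only an illustration under assumptions: hwnd stands for the handle of the newly launched app’s window, which the real taskbar code would already know.

    # Sketch: place a window on the monitor under the mouse cursor,
    # i.e. the screen where the taskbar click happened (Windows-only).
    import ctypes
    from ctypes import wintypes

    user32 = ctypes.windll.user32
    MONITOR_DEFAULTTONEAREST = 2
    SWP_NOSIZE, SWP_NOZORDER, SWP_NOACTIVATE = 0x0001, 0x0004, 0x0010

    class MONITORINFO(ctypes.Structure):
        _fields_ = [("cbSize", wintypes.DWORD),
                    ("rcMonitor", wintypes.RECT),
                    ("rcWork", wintypes.RECT),
                    ("dwFlags", wintypes.DWORD)]

    def move_to_cursor_monitor(hwnd):
        # 1. Where did the user click? (cursor position in screen coordinates)
        pt = wintypes.POINT()
        user32.GetCursorPos(ctypes.byref(pt))
        # 2. Which monitor contains that point?
        monitor = user32.MonitorFromPoint(pt, MONITOR_DEFAULTTONEAREST)
        info = MONITORINFO(cbSize=ctypes.sizeof(MONITORINFO))
        user32.GetMonitorInfoW(monitor, ctypes.byref(info))
        # 3. Move the window to the top-left of that monitor's work area.
        user32.SetWindowPos(hwnd, None, info.rcWork.left, info.rcWork.top,
                            0, 0, SWP_NOSIZE | SWP_NOZORDER | SWP_NOACTIVATE)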

iPhone doesn’t allow you to change the default hotspot name easily

Problem

We had a power outage at work, and even though my work laptop has a battery, the wifi network needs electricity to work. After a few minutes of internet suffocation, I decided to share the internet from my iPhone and keep working. But it was then that I discovered the iPhone wasn’t really built with the ambition of being in everybody’s hand. My company has an interesting policy of giving every employee an iPhone, and apparently I wasn’t the only one sharing my phone’s internet. So I saw something like this when trying to connect my Mac to my iPhone:

Everybody sharing their iPhone hotspot

Good for Apple, but I felt like this:

iPhone hotspots everywhere

So naturally I went to the most logical place to change the hotspot name to something custom, in order to find mine. But there’s no such setting:

iPhone personal hotspot settings

Since there’s no such setting, the first thing I did after getting online was search for how it can be done. It turns out you have to change your phone’s name in order to change the hotspot’s name. It may sound logical, but it’s pretty limiting. See how the HTC One lets you choose your hotspot name (needless to say, this option lives in the hotspot settings, where it belongs):

HTC One hotspot settings

But on the iPhone, you’re supposed to leave the hotspot settings, go to General > About and change your phone’s name:

iPhone 7 change name

A little unexpected, but as one user put it: “let down by apple yet again! i jailbroke my iphone 4 n installed an app called myWi it allows you to change ssid” (source).

The irony is that Apple is famous for paying attention to user experience, but in practice there are many simple cases like this that are problematic.

Solution

Allow users to edit their hotspot’s SSID in the Personal Hotspot settings.

HTC Sense pushes minimalism too far

Problem

HTC has customised the standard Android user interface with its proprietary product called Sense. This probably gives HTC products a competitive edge, but it also gives HTC designers a degree of independence from mainstream standards. That is powerful, but as I always say, “with power comes corruption”, and this is no exception. As the proud owner of various top HTC phones over the past 6 years, I am disappointed by the recent bold decisions in the design of their flagship product, the HTC One.

The problem is that HTC designers have decided to remove one of the 3 crucial Android controls from the bottom of the screen. Most Android phones have these 3 buttons, as seen in the following image:

The three standard Android buttons at the bottom of an HTC One X

From left to right they are:

  • Back: goes to the previous screen (or activity in Android programming terminology)
  • Home: takes you back to the home screen where you can see your desktop and widgets and run programs
  • Switch app: allows you to switch between running apps, kinda like ALT+TAB on PC or CMD+TAB on Mac.

Now somehow HTC designers figured they could remove the “Switch app” button and make room for their logo instead. The user is expected to double-tap the home button in order to switch between apps. By the way, the image above is from the HTC One X, the predecessor of the HTC One. Here is how the HTC One’s buttons look:

The HTC One’s buttons: the Switch app button is gone, replaced by the HTC logo

On the latest HTC phones, you are supposed to double-tap the home button quickly to switch between apps. That is nothing new; the iPhone works exactly the same way. However, this has three issues:

  1. This behaviour is not consistent with other Android phones
  2. This introduces a mapping issue where two unrelated actions are mapped to one control
  3. After 4 months of using this phone, I still haven’t got used to double-tapping the bottom right corner of my phone so often, so I have to use two hands (and I have dropped the phone enough times to justify spending 200 SEK on a protective shell)

Solution

The obvious solution is to bring back the Switch app button. There was a reason it was there, and let’s face it: removing a button to make room for a logo isn’t the best way to keep customers happy. Nor is it the best example of minimalist design.

Another solution would be to map the “Home” functionality to the HTC logo and return the Switch button to where it was before. We would again have three buttons, but the home button would be “covered” by the HTC logo. That would be acceptable for users: not the best interaction, but still better than what we have now.

You can’t turn off the shutter sound on the iPhone

Note: this article is part of the iOS issues series.

Problem

There is no way to remove the shutter sound on an iPhone except by putting the phone into silent mode! This ties together two unrelated features: your phone’s ringtone and notification sounds are controlled by the same switch that you are forced to use to shut up the camera. If you take a photo in silent mode and forget to turn the sound back on, you will probably miss your calls and notifications. If you do remember to turn the sound back on, you may take the next photo with the shutter sound.

A lot of people have this issue, and the only workaround they have found is to put the phone into silent mode. The core issue is using the same control for two different functions: it’s a mapping problem.

Some people have pointed out that phone makers are required by law to embed the shutter sound in certain countries. But the fact that Apple allows silencing it at all contradicts that compliance argument.

Probably the reason is to make a sound so people will notice if you are taking a photo of them. But what if you are not taking a photo of them? In my experience, the shutter sound mostly just annoys the people around you. For example:

  • in a library, when you are taking a photo of a book title or a magazine to read later
  • in a classroom, when you want to photograph the whiteboard instead of wasting time copying it down while the teacher is teaching
  • in a coffee shop, when you want to take a photo of your receipt for your archive
  • in your kids’ bedroom, when you want to take a photo of your dearest while they are asleep
  • or just documenting a crime without attracting attention or becoming a victim of it

There can be many other scenarios where you need to take a photo silently, but you shouldn’t have to disable one of your phone’s core functions and then remember to turn it back on afterwards!

Solution

Add an option for turning off the camera shutter sound, ideally a drop-down menu that allows both changing the sound and silencing it.

The close button doesn’t really close applications in Mac OS X

Problem

Note: This post is part of a series about top Mac OS X usability issues.

Having a little “X” button at the top of the application window is a de facto standard for closing an application. Just like Windows and Ubuntu, Mac OS X has these buttons too:

The minimize, maximize and close buttons in Windows 7

The close, minimize and maximize buttons in Ubuntu 12.10

The close, minimize and maximize buttons in Mac OS X Mountain Lion

But in Mac OS X, this button doesn’t actually close the application, and that confuses the user. It merely closes the window while the application keeps running. In some apps, closing the window stops the application’s activity; in other apps there is not much difference between closing and minimizing.

Story

I’ve been meaning to write about this particular confusion for a while, but today something funny happened that made me do it now. Today Apple released Safari 6.0.2, so the App Store suggested the update:

App store announces an update for Safari 6.0.2

But I was reading something in Safari, so the update procedure asked me to close Safari:

It suggests closing Safari to continue the update

I clicked the close button but nothing happened. Then I remembered that Mac OS X behaves differently, so I guessed the app was not really closed. In fact, you can see whether an app is closed or not by looking closely at the Dock (the Mac taskbar): there is a little white light under the apps that are “closed but actually open”. For example, in this picture:

How to tell which apps are actually open in Mac OS X

The first and second apps (Chrome and Safari) are open, while the third and fourth (Firefox and Opera) are closed. You can tell from the tiny light under each application’s icon.

So I had to click on Safari to bring it up again, then really quit the application by pressing CMD+Q, or alternatively through the application menu next to the Apple logo: Safari > Quit Safari.
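
Incidentally, both steps can be scripted. Here is a minimal sketch, assuming a stock Mac with the built-in osascript tool (Safari is just the example app):

    # Sketch: check whether a Mac app is really running, then ask it to
    # quit properly -- the scripted equivalent of CMD+Q / Safari > Quit.
    import subprocess

    def osascript(expression: str) -> str:
        result = subprocess.run(["osascript", "-e", expression],
                                capture_output=True, text=True)
        return result.stdout.strip()

    if osascript('application "Safari" is running') == "true":
        osascript('quit app "Safari"')  # a polite quit, not a force kill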

The update happily continued and finished the job. I’m sure ordinary users, who buy Apple products precisely because they don’t know and don’t want to learn much about how software works, will end up restarting their computer for this update to take effect. Yes, I’m sure: I’ve seen those users.

Solution

This interaction model is so interwoven into Mac OS X that Apple will probably never change it. They may have their own reasons for such weird behaviour, but whatever they are, the result is confusion, frustration and eventually dissatisfaction with the product.

The solution is simple: close the application when the user clicks the close button! Don’t keep it in RAM.

As with many other usability issues, there are programs that make the close button behave the way it is supposed to, for example RedQuits. But I wouldn’t install an app to fix this misbehaviour. Why would I buy a car with an uncomfortable seat and then pay extra to replace the seat?
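
To be fair to RedQuits and friends, the underlying fix is tiny, because Cocoa exposes this behaviour as a single delegate method. Here is a minimal sketch using the third-party PyObjC bridge (an assumption on my side; I can’t confirm this is how RedQuits works internally, but it shows that close-means-quit is one method away for any app developer):

    # Sketch: a Cocoa app whose close button really quits it.
    # Requires macOS and the PyObjC bridge (pip install pyobjc).
    from AppKit import NSApplication, NSBackingStoreBuffered, NSWindow
    from Foundation import NSMakeRect, NSObject

    class Delegate(NSObject):
        # Cocoa asks this after the last window closes; answering True
        # turns the red close button into a real quit.
        def applicationShouldTerminateAfterLastWindowClosed_(self, app):
            return True

    app = NSApplication.sharedApplication()
    delegate = Delegate.alloc().init()
    app.setDelegate_(delegate)

    # Style mask 15 = titled | closable | miniaturizable | resizable.
    window = NSWindow.alloc().initWithContentRect_styleMask_backing_defer_(
        NSMakeRect(200, 200, 400, 300), 15, NSBackingStoreBuffered, False)
    window.setTitle_("Close me and I actually quit")
    window.makeKeyAndOrderFront_(None)
    app.run()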

More

Chris Shiflett has put up a list of the top 10 usability issues in Mac OS X, and this close-button issue comes second on the list!

Mac OS X separates the menu from the application

Problem

Note: This post is part of a series about top Mac OS X usability issues.

Everyone who comes to the Mac from the world of Windows will notice one big difference: the menu bar is always separated from the application! There is no menu bar at the top of the application GUI; it always appears at the top of the screen, together with the OS-specific icons and menus. Here are some images to show what it looks like:

Firefox and its menu in Mac OS X

Finder and its menu in Mac OS X

Spotify and its menu in Mac OS X

User interface separation

This visual separation makes it hard for the user to associate the application with the menu actions. Think about it this way: you have your car with the driver’s seat and everything, but the steering wheel, gear stick, gas and brakes are separate from the car and installed on the roadside! I made a little sketch to show how ridiculous this is:

This is how an Apple car may look like: driver controls are separate from the body

It is worth mentioning that the menu at the top of the screen is not solely for the application: it also contains some menus from the operating system itself, as well as general system icons!

How the OS X menu bar is shared between application and the operating system

To make the situation even worse, not all the controls are on the menu bar. In fact, there are some controls and menus in the main GUI as well, which makes it even harder for the user to identify where the commands are hiding. Here is a pop-up menu from Finder, together with its menu at the top of the screen:

Besides the top screen menu bar, apps have their own menu too

One may argue that this is a pop-up menu and it is supposed to appear in the main interface. “Yes, exactly,” I would reply, and by the same argument I want to see the application-specific menu in its main interface as well!

Where does the menu belong?

Another issue is that you may have an app on your screen while the menu belongs to another app! For example, you may have two instances of TextEdit open (TextEdit is the Mac OS X equivalent of Windows Notepad). You are working in one document and you want to copy the selected part from another document. Something like this:

Two TextEdit windows open in Mac OS X

When you naively click on the Edit menu, you are actually clicking the Edit menu of the current window. It takes an extra click on the other window for the menu to change context, and you will not see any indicator of the change in the menu; you should just listen to your heart and feel the change (it’s a feedback issue). In Windows, every application has its own menu inside its GUI, so this simple perception mistake is avoided:

Two Notepad windows opened in Windows 7

An extra monitor or projector is a headache

This is probably the worst side effect of having the menu separated from the GUI: when you have an extra monitor or a projector, the main menu is shown on only one of them! Yes, that’s right: if you have an app on one screen, you may need to move the mouse all the way to the other screen every single time you want to use the menu! “Unbelievable” for those of you who are not Apple customers and think Apple is all about usability; “daily headache” for those of us who paid for an Apple Mac OS X product!

For example, let’s say you have Finder open on one monitor and Opera on another. While working with Finder, you decide to change some Finder preferences, and you have to bring the mouse all the way to the other monitor to access Finder’s menu:

If you have two monitors in Mac OS X, the main menu always appears on one of them.

And this is even harder with a projector connected, because dragging the mouse between screens has this weird “teleport effect” where the mouse travels a few meters from the projector to your laptop screen when it reaches the edge. It is confusing, misleading, hard to use and just an embarrassing way to present your Mac at a meeting!

It is of course possible to move the menu bar from one screen to another in the Mac OS X display settings, but then for the applications on the other screen you have to drag the mouse all the way to the projector.

Solutions

All the problems mentioned in this post could be solved simply by putting the menu bar together with the main application window. Microsoft is not exactly known as the maker of the best user interfaces, but it got at least one thing right: all the pieces of the GUI stay together.

Apple has always had the main menu bar like this, so its users are used to it. It is so well established that there are apps created to work around it (e.g. SecondBar). Given the complicated political conflicts, it would be extremely hard for Apple to behave like Microsoft. However, I don’t think Apple has any strong argument for not putting the menu bar together with the main application GUI. If they wanted to, they could make it happen, and then they could brag about adding a cool new feature to the next version of Mac OS!