How to predict a successful Google product (Hint: it’s the name)

Google announced last week that it will discontinue support for Wave, and I’m not surprised. At launch, the hype was huge and everyone was excited to bother their friends for Wave invites. But when I finally got mine, I opened Wave and thought, “What’s this? What am I supposed to do with this?”

The Wave interface is not as intuitive as I had hoped for in a Google product. When I’m conversing in a wave, I’m never sure where I’m supposed to click or what I’m supposed to select. Judging by Wave’s popularity, I don’t think I was the only one who thought so.

Found key item: ShiftIt

I’ve been using a Mac for the past three years, so when the slick Windows 7 Snap feature came out, I admit, I was a little sad I wasn’t in the market for a new Windows OS. Enter ShiftIt, a utility for Mac which replicates the behavior of Snap. The ShiftIt dropdown menu sits in my menu bar, and I can resize windows using either the dropdown menu or keyboard shortcuts. I use it primarily when I’m writing outlines for papers, so I can have my outline, notes, and research windows sized well together.

Admittedly, you don’t need this app to resize windows so that you can see more than one at once, but it makes the process much zippier, more accurate, and more convenient.

(ShiftIt works with Mac OS 10.5 and 10.6.)

My PowerPoint rule of thumb

It’s the age of criticizing PowerPoint, and everybody’s doing it. Last November I wrote a wildly popular post on why I have trouble learning from PowerPoint presentations in college classes. This week, the New York Times published an article dramatizing the problem, describing how PowerPoint is hurting the United States’ war effort in Afghanistan. Officers’ time is so tied up in making bullet-pointed storyboards that some of them spend more time on PowerPoint than on anything else. I think the NYT chose the example of the armed forces to make this story especially dramatic, and it goes a little over the top. The article even mentions that Obama was briefed with PowerPoint slides in last fall’s Afghan Strategy Review, as if to say that because the President sees it, PowerPoint is a scourge that has penetrated our deepest levels of government. Still, the article says out loud what many of us are afraid to: everyone is bored by PowerPoint presentations, and yet everyone expects them to be used.

I try to avoid the cursed Office product as much as possible. Sadly, a few of my professors actually require PowerPoint decks for class presentations. Having pity on my classmates, I try to make my presentations as interesting as possible. My rule of thumb goes like this: when I consider what to include in each slide, I ask myself, if I were giving this presentation without the aid of a projector, which visuals would I print out in hard copy because they’re that necessary to understanding the topic? Those images, along with a caption or two, are the only things I’ll allow in my slide decks. If it’s not worth spending money to print out, it’s not worth wasting my audience’s time on. If there is something important to say, the best thing to do is just say it, and reserve the projector for images that aid understanding.

I think most people do not understand that their slide decks do not have to stand on their own. Instead, they copy half their speech into their slide deck, as though hearing it and reading it at the same time will increase the audience’s attention. This not only takes up more of the audience’s time, but the speaker also wastes more time making the presentation, as the officers quoted in the NYT article did. I think we’d save a lot of time in meetings if people would learn to just say what they wanted to say, instead of writing a storyboard about it.

“I don’t know anything about computers.”


One of my biggest pet peeves is when friends say, “I don’t know anything about computers.”

This sentence irks me for a couple reasons. The first is that it is blatantly not true. I’ve met folks who have never touched anything more complicated than a solar-powered calculator. Compared to them, my friends — who use their computers constantly for schoolwork and practically live on Facebook — have considerable technical experience. It is disheartening to hear how little they value their knowledge.

Moreover, the context in which I typically hear friends say, “I don’t know anything about computers,” is as an excuse when their computer does something unexpected, they don’t know what to do, and they would rather back off and let someone else fix it than try to solve the problem on their own. My friends are afraid of their own machines. I think this sentiment is a symptom of ongoing trends in the industry towards a closed-box style of consumer computer design.

Cory Doctorow explains it better than I can in his iPad rant on BoingBoing:

The original Apple ][+ came with schematics for the circuit boards, and birthed a generation of hardware and software hackers who upended the world for the better. If you wanted your kid to grow up to be a confident, entrepreneurial, and firmly in the camp that believes that you should forever be rearranging the world to make it better, you bought her an Apple ][+.

[...] Buying an iPad for your kids isn’t a means of jump-starting the realization that the world is yours to take apart and reassemble; it’s a way of telling your offspring that even changing the batteries is something you have to leave to the professionals.

While Apple’s closed-box style contributes to the ease of use which is the hallmark of Apple’s products, I’m afraid that it is changing consumer attitudes in a negative way. Apple wants to keep the inner workings of their products a secret to the point that they want to make it illegal for consumers to alter the software running on their own property. Preventing users from controlling the software on their own devices is dangerous for several reasons, but it scares me most because it discourages users from learning about their devices. In effect, Apple is profiting from its customers’ ignorance, and the consequence is that more of my friends profess, “I don’t know anything about computers.”

Apple’s products are a timely example, but other manufacturers are guilty too, and I think it’s the generation just now learning about technology that will suffer most for it. Curious kids will never be able to tinker with the insides of their iPads the way they could with the Apple ][+. I think we as technophiles have a responsibility to kids to pick up the slack. Get your kids a garage-sale computer to take apart together. Find out if your teenager’s high school offers programming classes. Donate to or volunteer with groups such as TechBridge, which offers after-school programs in technology and engineering for underprivileged girls in Oakland, CA. But most importantly, make sure kids are not afraid of tinkering with technology. How else can they hope to make it better?

My classmates are taking their notes digitally, but I can’t fathom how they keep up

I noticed today that as I frantically scribbled to keep up with my philosophy professor’s lecture, there was an audible hum of typing in the classroom. It was the first time I noticed that I could count more students using netbooks than notebooks to take notes in class.

Call me old-fashioned, but I like to take notes with a pen and paper. As I’ve discussed previously, the act of writing helps cement the lecture material in my mind better than passive listening does, and studies have shown that it’s not just me [pdf]. Still, I know that my old-fashioned ways are quickly going out of style.

I don’t know if typing notes aids memory as well as taking notes on paper does, but I do know that it does not work for me. I decided at the beginning of last year that it would be nice to bring my laptop to class so that my notes would be neatly organized (and actually legible for once), and changed my mind after only one or two classes. I could never type fast enough to keep up with the professor, and every five minutes I found myself cursing at not being able to copy the diagram on the board. It was a relief to have my Five Stars and Pentel R.S.V.P.s back at the end of that little experiment. Considering my negative experience, I wonder how my classmates can keep up. I know that not everyone learns the same way I do; maybe my peers don’t need notes as copious as mine in order to do well.

If notes are going digital soon anyway, maybe there is a technology that will make up for my ineptitude with typed notes. Tablet computers have been around for years, but I know only one person who uses one in class, and even then she types rather than using the stylus to take written notes. (Maybe Apple’s soon-to-be-announced tablet will bring tablet computers into more common use, the same way the iPhone has with smartphones.) There are also electronic pens which record your written notes for later uploading. I was able to test-write one such pen at MacWorld Expo last year, and it was all right. It would probably mesh well with my way of learning, but I don’t trust myself either to bring one pen to every class or to keep it charged. I’m also not sure if my busy schedule can accommodate the extra step of uploading the notes from the pen to my computer.

Of course, I’m making the assumption that my classmates are actually using their computers to take notes rather than goof off online, which is a huge leap of faith and a different rant entirely. But even though I’m not keeping up with the latest tech trends in note-taking, I’m doing what works best for my learning style, and I’m okay with that.

The non-Kindle experience gets a little better, but still not great

In one of my first posts, I talked about how it’s great that textbooks are moving to digital formats, but Kindle-ized textbooks still don’t work for me right now. One of my main gripes was that Kindle ebooks couldn’t be read on a computer. Well, Amazon has started to fix that problem by introducing the Kindle for PC application, letting Kindle ebook owners read their material on their computers.

My reaction is still … really? You’re only just now rolling out this technology? It feels like it should have been an obvious piece of software from the beginning. It’s not as though reading an ebook on a computer is a novel idea. Amazon even used to sell a variety of digital books which could be read using either Adobe or Microsoft reader apps. I remember buying a couple of books this way in high school, but the files I purchased no longer appear in my “digital locker” on amazon.com. Now, almost all digital books for sale on the website are branded for the Kindle. (You can still find some books for sale as PDFs, but nowhere near as many as you could a few years ago.) Maybe Amazon hoped that delaying the ability to read its ebooks without a Kindle would encourage more hardware sales. It is, after all, losing money on the sale of most ebooks. However, the Kindle iPhone app has been freely available since March 2009. Make up your mind, Amazon!

It sounds like Amazon wants to restrict its reading platforms as much as possible, but not to the point where it stops them from being competitive. When Barnes & Noble announced its Nook last month, it advertised that both PC and Mac software would be available for reading your Nook-branded ebooks, and I think Amazon saw a need to match this feature. It’s a smart move; I’ll gladly pick the ereader that lets me read my books (which they insist on wrapping in DRM) on more devices.

While Kindle for PC is free and available to download now, Amazon.com currently lists Kindle for Mac as “coming soon.” For their sake, I hope they finish it before Christmas, or else there will probably be a few more Mac users opting for the Nook instead.

The highs and lows of cloud computing

Cloud computing, with services such as Salesforce and Google Mail and Docs, is easily my favorite internet technology. The potential for scalable, affordable services online really excites me, and I definitely plan to enter that sector of industry when I get my degree. But cloud computing is fraught with pitfalls, too, as a few recent data disasters have shown.

Upsides:

  • When my data is in the cloud, I can access it from anywhere. This becomes increasingly important the more devices you have. When I want to see my email from my personal computer, my iPod Touch, my netbook, my lab computer, my eReader, and my phone, it’s a lot more convenient to keep that data in the cloud than to manually sync each device. This property has saved me more than once, too. When I took a train to New York City last spring and found that I had forgotten my ticket confirmation number in the rush to get out the door, I was able to pull it up on a public internet terminal and still catch my train.
  • Cloud data is safer from loss than local data because it is backed up on someone else’s servers. If my office burns down, I’m still going to be able to access my email, and if the server goes down, there will be a dedicated team to fix the problem.
  • Cloud computing is necessary for software-as-a-service (SaaS) products, which can be very scalable and very profitable. When Salesforce gains a new client, it doesn’t have to come out and do a complicated database installation, train local IT on how to implement the product on local servers, or even make sure all the users’ terminals run the same operating system. The software is in the cloud and ready to go; all the local users need is a browser to access the database.
  • The cloud has also ushered in an era of free applications such as Google Docs, which not only competes with expensive office suites but also enables easy document sharing: you don’t have to upload your presentation to send to your coworkers if your presentation already exists online. These programs are easy to use because there is no installation, and they’re compatible with almost all computers because they work through a browser.

Downsides:

  • Your data might not be as safe as it sounds. Last month, as Microsoft performed an update on the servers that host data for T-Mobile Sidekick users, something went horribly wrong and all data in the cloud was lost. I don’t own a Sidekick, but I would have been outraged if this happened to me. The worst part is that there really wasn’t anything Sidekick users could do about it. While Microsoft “worked round the clock” to restore the lost information, they couldn’t possibly restore everything. Backups can fail. No server is 100% safe. So while your data might stand a better chance in the cloud, the more backups you have, including local backups, the safer you are.
  • If the company you trust with your data goes down, you might lose it. Yahoo announced in April that it would close its free web-hosting site, GeoCities. Last week every GeoCities site officially became unavailable. While Yahoo gave plenty of advance warning, it still hurts to find out that your website, something you consider your property, is going to be shut down no matter what you do. I’m sure plenty of GeoCities users never had the chance to save their data. Whenever you upload content to third-party servers, you put your data in their hands, and there is always a danger that they will delete it without your permission.
  • The flip side of the argument that third parties will ignore your data is that they will pay attention to your data. Online banking is a form of cloud computing, because the bank offers a virtualized resource as a service over the internet. That’s great, but there is huge pressure on the bank to make sure I’m the only one who can see and manipulate that data. Likewise, if I send confidential email, I trust Google Mail not to let its employees or anyone else read it without my permission, but neither I nor they can absolutely guarantee it will never happen. There is always a danger of unsecured data with cloud computing.
  • Cloud applications are primarily accessed through browsers, but browsers vary in which technologies they support. While modern browsers like Firefox and Google Chrome adhere to web standards, the browser that dominates the market, Internet Explorer, sometimes makes its own rules, which web developers spend lots of time and money trying to stay ahead of. SaaS companies take a risk because they cannot guarantee that the browser their client uses will be compatible with their software. Even scarier is the idea that Microsoft might decide it doesn’t like Google Docs competing with its office suite and make Internet Explorer incompatible with Google’s product.
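That last downside is why web developers lean on feature detection rather than assuming a capability exists. As a minimal sketch of the idea (the `pickStorage` function and its shape are my own hypothetical example, not any real product’s API), an app can check for a feature at runtime and fall back gracefully when an older browser lacks it:

```typescript
// Hypothetical sketch: detect whether a browser-like object exposes
// localStorage, and fall back to cookies when it doesn't. Feature
// detection like this is how web apps cope with browsers that lag
// behind the standards.
function pickStorage(win: { localStorage?: unknown }): "localStorage" | "cookies" {
  // Check for the capability itself instead of sniffing the browser name.
  return typeof win.localStorage !== "undefined" ? "localStorage" : "cookies";
}

console.log(pickStorage({ localStorage: {} })); // modern browser → "localStorage"
console.log(pickStorage({}));                   // legacy browser → "cookies"
```

The same check-then-fallback pattern shows up wherever a SaaS app can’t control which browser its customers run.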

So while cloud computing is exciting because of its scalability and versatility, it is also dangerous because it puts personal data into the hands of third parties. I still think, however, that as people start regularly using more and more devices in addition to personal computers, companies that use a cloud architecture to deliver their products will be the most successful.

Twitter’s assault on the English language

Twitter is one of those double-edged swords of the internet community. While it can be a powerful tool for good, such as when Iranian protesters tweeted their unrest in June, it can also be a tool for evil, judging by the amount of spam Twitter sends out. Still, I think one of the worst crimes Twitter and its users have committed is butchering the English language with spinoffs and spoofs of the word twitter, both in terms for its users and in the names of companies who utilize it. I shall attempt to catalogue some of the worst offenders:

Words that describe the Twitter service and those who use it (I swear I did not coin any of these terms):

It all started out very innocently. There was a website called Twitter, on which tweeters could post a tweet. Then the service expanded enough that it became a twittersphere, on which twitterites posted their twitworthy musings. When power-users entered the scene, they became the twitterati whose twitticism reached twitical mass. Now if you desire to be a twittizen of the twitterspace, you must follow proper twittiquette, lest you be deemed a twittiot.

It’s getting to the point where the vocabulary of social networking overlaps with that of Elmer Fudd. I’ll admit it was funny when Stephen Colbert made Meredith Vieira lol on the Today Show when she asked him if he used Twitter. He answered, “I have twatted” on national television; I’m sure the NBC producers loved that. And I’m sure if Randall Munroe had drawn XKCD #181 “Interblag” in 2009 instead of 2006, he would have included “tweeto” in his column of internet nickname prefixes. (I blog on the tweetotubes, myself.)

Companies that use Twitter for advertising or part of their service: This was really bad in the days when people didn’t know what Twitter was, and companies had to include plays on tweet or twitter in their names to denote that they utilized the website. There are still some bad ones out there:

  • Best Buy’s @Twelpforce: Best Buy rolled out this service in July of this year when users like @ComcastCares were starting to become popular. They even put out a couple of corny videos to advertise it. No matter how useful they may be, I just can’t forgive them for their name.
  • Twaitter: Twaitter.com is a service that times your tweets for you so that advertisers can reach the widest audience. I keep thinking it has to be a group for out-of-work busboys.
  • Twinester: Twinester.com organizes groups and communities of Twitter users. You wouldn’t know it from their name, though. My first thought was that it was a club for fans of a strong thread or string composed of two or more smaller strands or yarns twisted together. Their logo’s coloring suggests that they intend their name to be pronounced “twi-nest-er,” rather than “twine-ster,” unfortunately.
  • Twetris: Despite the corny title, this flash game turned out to be a decent representation of Tetris, using recent updates organized into blocks of varying sizes and colors. I approve of the game, though not the name.

What Twitter-related names make you cringe? Leave a comment, or (heaven forbid), tweet me about it.

Snow Leopard revisited

According to the headlines, Snow Leopard is in trouble.

Mac: Hi, I’m a Mac
PC: And I’m a PC
Mac: Hi, I’m a Mac
PC: Are you OK?
Mac: Where am I? Who are you?

Such was reddit.com’s top-rated comment yesterday when word of a major bug in Snow Leopard got out. Apparently the operating system has a bug that can delete all user data if someone logs into a guest account, then back into their regular account. The BBC and several other sites today reported that Apple has acknowledged the bug and is working on a solution. In the meantime, they advise deleting any old guest accounts and using only native Snow Leopard guest accounts if necessary, which suggests that the bug comes from a problem in upgrading guest accounts native to Leopard (or possibly even earlier).

I suppose I could count this as another reason not to upgrade to Snow Leopard, but it doesn’t seem to be as big a problem as the media would have us believe. This bug obviously doesn’t bite every time someone uses a guest account in Snow Leopard, or it would have been reported much earlier, considering the operating system was released in August. Another good reason to upgrade is that my assumption about pricing turned out to be incorrect. While Apple’s official story is that to upgrade from Tiger you have to buy the $169 box set, quite a few sources have reported that upgrading directly from Tiger using the $29 package works just fine. I even have a couple of friends who have done it with no problems. As for me, I’m still holding back because I’m running a few legacy programs for classes that I couldn’t get by without. But when I do upgrade, I will make damn sure to delete any upgraded guest accounts!

Cool airport stuff found

While on my way home from my trip to the Grace Hopper Conference in Tucson this year, I got to spend plenty of time in a few different airports. While in Dallas on a layover, I saw a couple uses of technology that I thought were particularly clever.

• Gadget Vending Machines: I know these devices are not new. I’ve gotten used to seeing iPod vending machines in shopping malls; my local Macy’s has at least one. I had never understood the appeal, though. Gadgets costing over $100 tend not to be spur-of-the-moment purchases, so why would anyone buy an iPod or a digital camera from a vending machine? I’m sure better deals can be had online. When I saw one of these machines in the airport, however, it suddenly all made sense. Considering how easy it is to lose your gadgets while schlepping through security and such, an airport is one of the few places where you might suddenly decide that you need a new digital camera or a high-end pair of headphones. (I’m not sure about the iPods, because a factory-fresh iPod wouldn’t have any music on it, making it less than useful as entertainment on a plane.) The airport creates the perfect environment of hectic transportation and emergency purchases to support these machines, and I’d never thought of that before.

• Ad-Supported Public Internet: As I walked down the terminal, lamenting the lack of free wifi, I passed a kiosk offering free public internet. I thought this was strange, considering that when customers are trapped in a closed environment like an airport (or a plane, for that matter), they usually have to pay through the nose for basics like food and internet. Intrigued, I took a closer look. The kiosk did indeed offer free internet access: it prompted the customer to click on one of three ads on the screen, clicking an ad started a short video, and after that there was internet access. I didn’t test the machine much further than that, because who knows what kind of tracking software could have been installed. Still, I’m a fan of ad-supported services (Gmail, anyone?), and I think it’s a step in the right direction for airports to offer ad-based services rather than the digital equivalent of the $10 ham sandwich.