Aucbvax.2389
fa.works
utzoo!decvax!ucbvax!works
Tue Jul 21 16:23:29 1981
Collected responses on terminal input devices

>From WorkS-REQUEST@MIT-AI Tue Jul 21 16:04:02 1981

------------------------------

Date: 20 Jul 1981 12:09 PDT
From: Kimball at PARC-MAXC
Subject: Terminal Input Devices
In-reply-to: WorkS-REQUEST's message of 20 Jul 1981 08:00-EDT
To: WorkS at MIT-AI, ALBrown

Speaking of soft keyboards, I'm surprised no one has mentioned an old
idea that has been kicking around for 5 or 10 years: an image of the
keytops can be generated on the display and then reflected off a
half-silvered mirror mounted over the keyboard.  The user can see the
keys (even when his hands are over them!) with whatever labelling he
desires, switched at electronic speed.  Furthermore, the geometry is
such that the user doesn't have to be exactly "on axis" to see the
desired image.

Of course, there are some drawbacks, but none of them seem to be
showstoppers:

1) a lot of expensive resources (e.g. bitmapped display & memory) are
   given up to support the keyboard image.  Also, the image on the
   screen surface itself is upside down;

2) the glass "shield" over the keyboard sounds awkward;

3) I wonder whether screen curvature, raster blooming, and the like
   would make it hard for the keytop images to be precisely aligned
   with the physical keyboard.

Ralph Kimball

P.S. Allen Brown tells me that this concept was explored by someone at
IBM on behalf of J. C. R. Licklider (Licklider @ MIT-XX).  Forgive me
if this is a garbled pointer.

------------------------------

Date: 20 Jul 1981 1319-EDT
From: Bob Hyman
To: SHRAGE at WHARTON-10, works at MIT-AI
Subject: Re: Interchangeable keyboards
In-reply-to: Message from SHRAGE at WHARTON-10 (Jeffrey Shrager) of 18-Jul-81 0641-EDT

At an NCC a while ago, I saw a terminal with a dynamically labelable
keyboard.  The keys were arranged in a 10 x 50 matrix and had
transparent tops.  There was a mechanical (air-driven, I believe) sheet
feeder that could slide any one of about 10 different layouts under the
key matrix.  The particular layout was selected by function keys off to
one side, and it took about 1/10 sec. to switch, accompanied by some
hissing and clunking.  It was not an entirely unworkable arrangement.

------------------------------

Date: 20 Jul 1981 1218-PDT
From: Steve Klein
Subject: QWERTY space bar
To: WorkS at MIT-AI

If the RETURN key is in the wrong place and the full-length SPACE bar
is a waste, why not split the SPACE bar and use the left thumb for
RETURN?  One would think this would not cause too much trauma either
for manufacturers or users.

------------------------------

Date: 21 Jul 1981 00:39:45-PDT
From: decvax!duke!unc!smb at Berkeley
To: WorkS at MIT-AI
Subject: keyboard tracking system
Cc: duke!shg@Berkeley

This note was sent to me; I thought I'd pass it on.

>From duke!shg Mon Jul 20 09:34:50 1981
Date: Mon Jul 20 09:33:20 1981

I saw your note about a keyboard tracking system.  It seems to me that
the most convenient position for a cursor control setup is just below
the space bar on the keyboard.  A small trackball or modified joystick
(or even a two-dimensional slide switch) could be easily manipulated by
either thumb without moving the fingers from the keyboard.  I find that
I always use my right thumb for spacing, so I guess with a little
practice I could use my left thumb for cursor control EVEN WHILE
TYPING.
------------------------------

Date: 20 Jul 1981 0838-PDT
From: WMartin at Office-3 (Will Martin)
Subject: Keyboards
To: works at MIT-AI
Message-ID: <[OFFICE-3]20-JUL-81 08:38:32.WMARTIN>

Keyboards:  There was a LONG series of discussions on Human-Nets some
time back about Dvorak keyboards.  If there are people on this list who
weren't exposed to that, maybe somebody with an MIT account could run
an editor through the HN archives and come up with a consolidated file
of that exchange for FTPing.  That would be appropriate, since the list
now seems to be covering alternatives to the standard keyboard, and
Dvorak has lots of supporting data, which was outlined in that
discussion.

[ A transcript of the HUMAN-NETS discussion on keyboards is available
  in the file DUFFEY;WORKS KEYBRD on MIT-AI. -- RDD ]

------------------------------

Date: 21 July 1981 02:12-EDT
From: Marvin Minsky
Sender: MINSK0 at MIT-AI
Subject: pointing devices
To: MINSKY at MIT-AI, WORKS at MIT-AI, norman at NPRDC

I agree with Donald Norman about re-examining keyboards.  I wasn't
concerned with keeping hands on the keyboard, because I once learned
some American Sign Language (ASL) and saw that sign language works
quite well and could be quite fast -- provided the intelligent
observing machine can keep up.  One learns a large lexicon of special
words and symbols in ASL and, when these fail, one uses
"finger-spelling".  The latter is lots slower than expert typing, to be
sure.  But this is because one has to reconfigure the whole hand for
each letter; the vision machine could sense smaller finger changes than
a person could, I think.  Then we could adopt Norman's idea of using
bidirectional finger motions, little "chords", etc.  In the end it
should be faster than typing.

Gloves and rings and things might do, but I think AI will get around to
making good seeing machines eventually, and they'll do so many things
that they'll be cheap.  In the end, there will be two or three of them
inside the average typewriter, just watching for paper jams and ribbon
problems.  After a while, people will find that they don't need many of
the machines that the vision boxes were made to keep an eye on.

------------------------------

Date: 20 July 1981 1222-EDT (Monday)
From: Hans Moravec at CMU-10A (R110HM60)
To: WorkS at mit-ai
Subject: Gloves

Along with the keyboard gloves you get a head-mounted binocular
display, as in the old Utah 3D system.  Now you can not only move your
head from side to side to reveal obscured pages, but can walk around
your workspace and view it from behind or underneath.  If you're into
such, the entire workspace can be mapped onto the surface of your real
desk, and there can be simulated piles of paper that look like the real
thing!  To focus your attention on one, just move your head closer to
it.  If the head-mounted display carries outward-looking cameras that
can track your fingers (and a microphone and earphones), you could pick
up and shuffle the simulated paper.

In the long run all this stuff should be integrable into an eyeglasses
frame.  It needs some kind of inertial or other navigation system to
make sure it knows where your head is, to generate the appropriate
view.  With a radio link to a communication system and a shaving mirror
it could be used as a videophone.  Or a cheap telepresence terminal.
Or a syntha-presence unit: imagine the adventure display possible when
you can walk around the scenes in 3D (need a lot of crunch power for
this, but much more practical than some "holographic" methods suggested
by Niven).
Better watch your icons, though!

------------------------------

Date: 20 Jul 1981 (Monday) 1804-EDT
From: DREIFU at WHARTON-10 (Henry Dreifus)
Subject: In response to "gloves"
To: works at MIT-AI

It was suggested to me by Saul Levy of Bell Telephone Labs (so as not
to implicate myself) to use Teflon boots that someone puts their feet
into, so as not to have to remove one's hands from the keyboard when
typing.  I leave this as a comment, nothing more.

Hank

------------------------------

Date: 20 July 1981 1056-EDT
From: David Smith at CMU-10A
Subject: Pointing devices
To: WorkS @ mit-ai

In the summer of '78, I saw a demo at SRI of a device which could tell
where your eyeballs were pointed.  It used internal reflections in the
lens.  People were writing their names with it.  The writing was rather
jerky, because the eyeballs move in saccades.

If your work station had one of those, plus a speech (word) recognizer,
you wouldn't have to remove your hands from the keyboard to designate
an icon.  Lacking a speech recognizer, you could type
escape-footpedal-foo, but that lacks class.

------------------------------

Date: 20 Jul 1981 1351-PDT
From: Kelley at OFFICE
Subject: The Back Split Twist Keyboard
To: works at MIT-MC

Take the Maltron contoured keyboard.  Chop it in half down the middle.
Put mouse wheels under each half.  Pick the portion of the desktop you
are viewing on the screen with one half.  Pick entities on the screen
with the other half.  No need for your hands to leave the keyboard.
Engineer a little to keep the keyboard stationary while you type.

Now.  Take a flat display screen that fills one whole surface of a box
about the size of the Whole Earth Catalog.  Put your processor in the
box.  On the back, place each half of the keyboard, twisted so you are
holding the book while you type.  Control wheels / track ball on the
side with the thumb / palm of your hand.

Control your dynabook with your back split twist keyboard while you
walk the earth.

-- kirk

------------------------------

Date: 20 Jul 1981 (Monday) 1935-EST
From: STECKEL at HARV-10
Subject: recommended reading
To: WorkS at MIT-AI

Seeing the flames and flak fly freely the last few weeks, I would
strongly recommend that all participants read ACM Computing Surveys
Vol. 13, No. 1 (March 1981), which addresses "human factors in
computing".  Of particular interest are the article on editors and
Beau Sheil's article.

As an aside, I would suggest that the ideal "terminal" look like a pad
of paper (flat screen display), with a keyboard on the lower 1/3 or
so...

g steckel

------------------------------

End of collected responses on terminal input devices
****************************************************

-----------------------------------------------------------------

gopher://quux.org/ conversion by John Goerzen
of http://communication.ucsd.edu/A-News/

This Usenet Oldnews Archive article may be copied and
distributed freely, provided:

1. There is no money collected for the text(s) of the articles.

2. The following notice remains appended to each copy:

   The Usenet Oldnews Archive: Compilation Copyright (C) 1981, 1996
   Bruce Jones, Henry Spencer, David Wiseman.