Category: hci

Gargoyle Cowboy

posted by ben on 11.07.02 at 10:56, hci, 1 comment Permalink

I just read Accelerando by Charlie Stross and am having a bit of a cyber-utopian moment. Apparently this is well timed, as I just got a call from a friend in London claiming the venture capital is flowing like it's 1999.

One of the three protagonists, Manfred, builds up a custom wearable from random bits and pieces to augment his memory. I'm wondering if technology has gotten to the point where it's finally possible to do this. I've made a few fitful attempts in the past but a combination of crap technology and my cheapness made it impossible.

I'm starting to think goggles/glasses are not the way to go. They'd be great if there were COTS HUDs, but those don't exist. What if instead you had a number of small displays, like a cross between a trader's workstation and the magnifying glasses watchmakers used to mount on their heads? You could move them around and still have some semblance of interaction with reality, even if no one would talk to you.

So, I'm thinking:

(1) Cowboy Hat (maybe a top hat?)
(2) Drill a couple of holes in it and mount some little displays, something like these. Of course, it looks like these have some awful interface that is VGA but not through a VGA plug, so there's probably a lot of work involved in getting this going.
(3) Now you need to provide computing power for the thing. Maybe a stripped netbook with VGA out and a three-way splitter? Maybe the innards of a full laptop? I'm a little stuck here.
(4) Then there's the question of battery life. This really needs to have enough juice to go for a day.
(5) Net connectivity is easy. Just jam a USB 3G card in the top and you're good to go.
(6) Input used to be a nasty problem. Now I'm just thinking bluetooth keyboard and mouse. Maybe those pretty Apple ones. It'd be nice if there were something you could use while wandering around but the convenience of a real input device probably outweighs this.

I really like the idea of building this all into a 5 lb hat. It might be a little heavy, but it'd be so much better than dealing with a million cables.

Comment from: ben [Member] ·
It looks like putting together what I'm imagining would cost something like $10,000. I suppose I'll wait another 5 years and look again.

I'm thinking about buying a Lenovo X220T to replace my work computer. It seems like it would be a good ubiquitous thing to cart around the country.
Permalink 07/03/11 @ 13:32

Wacky google error

posted by ben on 08.05.14 at 03:43, hci, Leave a comment Permalink

Which cult should I join?

posted by ben on 06.10.19 at 20:09, hci, Leave a comment Permalink

Is Ruby on Rails the divine thing that these people and everyone they interview believe it is? I don't want to be missing the boat, though I suppose if Ruby is the path to salvation I already have. They sound a lot like the people who get all excited about Python and you can probably guess what I think of them.

This has a particular relevance as I'm about to embark on a potentially gigantic development project. It's been suggested I use JavaServer Faces which seems more reasonable since it's backed by a real company and not a bunch of tech dorks.

PHP is easy, but doesn't scale nearly as well as Java. I suspect Ruby has the same problems. Java makes me happy. Or I could write a bunch of sh and C CGI to prove to everyone how indie I am.

Also: Exim, Sendmail or something else?

And I assume MySQL is the clear SQL winner? Dissenting opinions?


the one (or i 1z l33t h@x0r)

posted by ben on 06.10.13 at 20:44, hci, 2 comments Permalink

Remember when The Matrix came out and everyone screamed about how stupid it was that hackers were depicted as spending their days staring at binary streaming by? Well, guess what I'm doing tonight...

Comment from: anwar [Member] ·
try not to do it for too long...

being able to add/multiply in hex is not a skill you want taking up precious brain space.

I never did get the hang of hex division though.
Permalink 10/16/06 @ 21:38
Comment from: ben [Member] ·
sadly, it's exactly what would have helped at a recent interview.
Permalink 10/16/06 @ 23:43
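For what it's worth, the machine is happy to hold onto that skill for you. A quick Python sketch of hex addition, multiplication, and even the division anwar never got the hang of:

```python
a, b = 0xA3, 0x1F  # 163 and 31 in decimal

print(hex(a + b))   # 0xc2 -- addition
print(hex(a * b))   # 0x13bd -- multiplication
print(hex(a // b))  # 0x5 -- integer division, no brain space required
```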


posted by ben on 06.08.30 at 15:16, hci, Leave a comment Permalink

Manually defining and using PInvoke signatures (also known as Declare statements in VB) is an error-prone process that can introduce extremely subtle bugs. The rules are complex, and if you make a mistake, you’ll probably corrupt memory.
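Python's ctypes has exactly the same failure mode, and the same cure: declare the signature explicitly instead of letting the marshaller guess. A minimal sketch (POSIX-only, since it pulls strlen out of libc):

```python
import ctypes

# CDLL(None) resolves symbols from the running process on POSIX,
# which includes the C library.
libc = ctypes.CDLL(None)

# Declaring argtypes/restype is the ctypes analogue of a PInvoke
# signature; leave them off and ctypes guesses the marshalling,
# which is exactly how the subtle bugs above creep in.
libc.strlen.argtypes = [ctypes.c_char_p]
libc.strlen.restype = ctypes.c_size_t

n = libc.strlen(b"hello")  # 5
```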

for devin

posted by ben on 06.08.11 at 04:26, hci, technology, 2 comments Permalink

If you look at the much larger version of this image, toward the middle, you'll see an MSVC compile going. The cool part is that it's compiling two targets at once... notice the 1's and 2's. And the CPU usage is at 99%, so you can be reasonably sure it's spreading the work out evenly between processors.

Everyone needs at least 2 cores. I need 16.

Comment from: devin [Member] ·
You should be able to fully utilize your dual core goodness with a single target compile. Compiling C is one of those embarrassingly parallel problems -- with little or no work on the compiler engineer's part you could run as many parallel execution paths as there are C translation units. Because of I/O latency, the recommendation is to run even more parallel compiles than you have cores (although now that we have pre-compiled headers, there should be much less I/O going on). Typically you should run 2-4 compile processes per core. (This is why on a lot of unix systems you'll see make aliased to 'make -j 2' or 'make -j 4'.)

An engineer here at WWDC mentioned that on their quad-core machines they've determined that 14 parallel compilation processes is the sweet spot. (But it is worth mentioning that even with network-scale latency, you can distribute C compiles across 64 machines before you reach the point of diminishing returns.)

If only it were so easy to parallelize UI code.

Also: a three-monitor screenshot - that's so fucking cool. Everyone should have at least two cores and three monitors.

Permalink 08/11/06 @ 09:22
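devin's point is easy to see in miniature: translation units are independent, so a worker pool at 2 jobs per core (his rule of thumb) chews through them in parallel. A hypothetical sketch, where compile_unit stands in for an actual compiler invocation:

```python
import os
from concurrent.futures import ThreadPoolExecutor

def compile_unit(src):
    # Stand-in for shelling out to cc on one translation unit;
    # each call is independent of every other, hence "embarrassingly parallel".
    return src.replace(".c", ".o")

units = ["main.c", "parser.c", "codegen.c", "util.c"]

# 2 jobs per core, per devin's rule of thumb: compiles stall on I/O often
# enough that overcommitting the cores keeps them all busy.
jobs = 2 * (os.cpu_count() or 1)
with ThreadPoolExecutor(max_workers=jobs) as pool:
    objects = list(pool.map(compile_unit, units))
```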
Comment from: ben [Member] ·
I stand corrected. I need 64.
Permalink 08/11/06 @ 15:07

posted by ben on 06.07.24 at 01:32, hci, 7 comments Permalink

How do you create a product that provides encryption and anonymity to everyone, even my grandmother? How do you profit from that product?

Comment from: collin [Member] ·
Jesus, it took me way too long to figure out what the "hci" category meant. [Mmmm, category, objects and maps...]

Are we talking about a software product? Encryption and anonymity for transmitted data? Encryption for local data?

As for profit, you simply prey upon people's fear.
Permalink 07/24/06 @ 09:45
Comment from: ben [Member] ·
I've been thinking about transmitting things, really everything.

Yes, there are methods to do this, and yes, they do prey upon people's fear, but they are such a pain to use that only the truly paranoid bother with them.

The technical problems aren't the issue. I'm simply trying to think about what a usable interface would look like.
Permalink 07/24/06 @ 15:02
Comment from: ben [Member] ·
what else does HCI mean? I'm still not sure who made that category... Devin maybe?
Permalink 07/24/06 @ 16:15
Comment from: devin [Member] ·
I'm still not sure who made that category... Devin maybe?

No, it wasn't I. I have a policy of posting under the coffee category because I think having categories on nonplatonic is stupid.

Using my intrepid link clicking skills I have discovered that it was, in fact, Scott who made the HCI category.
Permalink 07/24/06 @ 16:42
Comment from: collin [Member] ·
Human computer interface, is what I was guessing.
Permalink 07/24/06 @ 19:41
Comment from: ben [Member] ·
oh, I was hoping for something cool:

hydro collector interlocutor
hamster catydid imaging
hell-carolina integrator
Permalink 07/24/06 @ 20:03
Comment from: scott [Member] ·
HCI = human-computer interaction

i.e. the field that studies computer interfaces.
Permalink 07/25/06 @ 13:48

posted by ben on 06.06.27 at 01:39, hci, Leave a comment Permalink

Dear interwebs,

Is google working on google current maps? If not, why not?

posted by ben on 05.10.24 at 20:50, hci, 2 comments Permalink

Graham had an idea I liked. Write a program to save playlists. Load them into iTunes, Winamp, etc. and they play the songs you have on the list. If you happen to be missing a song, it just gets skipped.

I know you could create a lot of different moods by picking from my music collection, but I never get anything coherent since it's always on shuffle. And don't start about how I should listen to every album the whole way through...

Comment from: collin [Member] ·
How is this different than, say, the playlists in iTunes now? Do you want something that just creates a playlist from all songs out there, and then iTunes or whatever just plays the ones you have?
Permalink 10/27/05 @ 20:30
Comment from: graham [Member] ·
I don't know what iTunes does, because I scoff in the general direction of everything with an "i" in front of its name. Also, iTunes sucks and I hate it.

I'm not suggesting attaching an ID number to every song, though that might actually be fine considering all the crap stored in ID3 tags now, but rather just searching for the song on the computer that most closely matches the song in the playlist. The reasoning for this is there are still no common file naming, tagging, or directory organizing conventions, and thanks to gnutella or whatever everybody used to get their music, we often know next to nothing about the song we have.

And while I'm on this, I'm also a big fan of things (I don't know what they're called but I'm sure there's some name for them) like Amazon's "Customers who bought this title also bought:" lists. If you could train a computer with playlists you make yourself to generate new playlists when given a few tracks to start with, that could be the coolest thing ever.
Permalink 10/28/05 @ 19:45
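graham's closest-match scheme needs nothing fancier than the standard library. A sketch, assuming fuzzy filename matching is good enough; the resolve function and its 0.6 cutoff are invented here, not part of any real player:

```python
import difflib

def resolve(playlist, library):
    # For each playlist entry, find the local file whose (lowercased) name
    # is the closest fuzzy match; entries with no decent match are skipped,
    # just as the post describes.
    by_norm = {f.lower(): f for f in library}
    found = []
    for title in playlist:
        match = difflib.get_close_matches(title.lower(), by_norm, n=1, cutoff=0.6)
        if match:
            found.append(by_norm[match[0]])
    return found

songs = resolve(["Hey Jude", "Some Song I Do Not Own"],
                ["hey_jude.mp3", "yesterday.mp3"])
```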

You can choose to be assimilated.

posted by ben on 05.07.03 at 21:29, hci, rant, Leave a comment Permalink

There's a discussion (if it can be called that) going on over at slashdot about some comment Gates made. Apparently he doesn't want to be a cyborg. The slashdot crowd is reading this as fear that buggy Windows software will crash people. No one is thinking about the much simpler explanation: maybe Gates likes his humanity just the way it is. Things are going pretty well for him, so he probably doesn't feel the need to plug in and drop out. Hell... maybe augmented reality won't make supermen, just more mockeries of human nature crouching in neon-lit boxes.

I must fight the urge to look at the comments. I foolishly tried to comment, but my slashdot account (from maybe 5 years ago) seems to be gone. It's sad too, I bet I had a respectable ID.

Data Mining in Strangely Biased Data

posted by ben on 05.06.11 at 02:03, hci, 1 comment Permalink

The chart shows lift greater than one for the commonly used screen resolutions such as 1280x1024, 1024x768, and 800x600 implying that visits with these resolutions tended to search more than the average visit. The resolution 640x480 has lift less than one. The reason for this is interesting. We found that when the screen resolution was set to 640x480, the search button disappeared past the right edge of the browser screen. In order to access the search button, one would have to scroll to the right, which explains why so few visits with that resolution performed a search.

Current bot filtering is mostly based on a combination of a continuously tuned set of heuristics and manual labeling. It is worth mentioning that page tagging methods of clickstream collection (Madsen, 2002), which execute blocks of javascript at the client’s browser and log the statistics returned by the javascript at a server, avoid bots because they require the execution of javascript, which bots rarely execute. However, people who do not have javascript turned on in their browsers or who click on a link before the javascript code can download and execute will not have their visits correctly logged by page tagging systems. These visits can amount to about 5% of all human visits, thereby resulting in inaccurate clickstream statistics.
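Lift, as the excerpt uses it, is just a ratio of rates: P(search | resolution) divided by P(search). A quick sketch with invented numbers (the function and its figures are mine, not the paper's):

```python
def lift(hits_in_segment, segment_size, hits_total, total):
    # How much more (or less) likely the outcome is within the segment
    # than in the population overall; > 1 means over-represented.
    return (hits_in_segment / segment_size) / (hits_total / total)

# Invented numbers: 30 of 100 visits at some resolution searched,
# versus 200 of 1000 visits overall.
example = lift(30, 100, 200, 1000)  # 0.30 / 0.20 = 1.5
```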

Begin Analytic Rant:

The 640 thing is just funny. I wonder how much business would be hurt by dropping anyone at 640... I imagine they not only make up an increasingly pathetic portion of browsers, but a more pathetic portion of buyers... anyone who can't afford a post 1990 monitor is probably in an ugly financial situation anyway...

The bot thing strikes me as a little more insidious. Surely google should be indexing everything... and you don't want to try to break google, only help it. This is an enlightened self interest thing. All hail the ever increasing google rank...

Beyond that, is it really that bad to have other sites index your content and link directly to it? I guess it depends on your business model... but if you sell things, do you really care how a customer got to a page if they're planning on buying? Worst case someone comes up with a better interface to your crappy site (I'm visualizing pricewatch indexing newegg)... There are probably certain customers who come to newegg that wouldn't have otherwise due to pricewatch. But, by providing a better interface, pricewatch diminishes brand loyalty to newegg.

To make it clearer... The ideal scenario for newegg would be two fold: Customers would browse their site alone because the prices and interface are good (hah). Second, new customers would come to newegg and stay because the interface is superior to that of pricewatch (hah again).

Misdirecting/Denying indexing and price listing bots only creates market friction. I can't imagine how this friction is beneficial, but there are scenarios where it hurts. The most obvious is when it prevents outside parties from fixing a stupid UI for free.

Also, Pedro Domingos is cool.

Comment from: Other Graham [Visitor] ·
something interesting along these lines:
Permalink 06/11/05 @ 09:20

The Future

posted by ben on 05.06.06 at 15:18, hci, rant, 1 comment Permalink

Code gets ever more abstract: blog nonplatonic = new blog(authors, archives); Distinctions between individual programming languages, daemons, boxes... all of it continues to blur.

Humans don't see things like ls anymore. Metadata disappears under layers of abstraction. Metadata ceases to be divided into categories understandable to humans. Clustering, fuzzy clustering and finally something with an actual statistical motivation takes over.

Computers learn to classify anything. Images and video become the stuff of the internet, all indexed by machine. Words go the way of books while VR makes a comeback... implemented in some portable format... not unlike VRML. Pretty portable GUIs for everything...

The machine becomes increasingly transparent... until we go online and see other people, not websites. Something analogous to the transition from telegrams to cellphones.

The human race becomes extinct because we can no longer relate to one another in person, but spend all our time surrounded by LCDs and laser retina display things. That, or nitrogen content in the soil goes to zero and we all starve... Ninja space monkeys inherit the earth.

Comment from: collin [Member] ·
In the future all computers will use Intel processors.
Permalink 06/06/05 @ 18:21

Bug me no more

posted by graham on 05.06.01 at 14:15, Rants, Raves, Missleaneus, Photos, coffee, math, books, news, Ideas, random, hci, music, politics, art, movies, technology, 3 comments Permalink

While getting a user/pass from bugmenot, i noticed their "NY Times recommends bugmenot" link.

I especially like the bit about stuffing return envelopes with pieces of sheet metal.

Also, while I'm strongly in favor of the short/tall/grande system, the Lincoln Park shirt sounds pretty amusing.

Comment from: ben [Member] ·
And Venti? What is there to be said in favor of the Starbucks system?
Permalink 06/01/05 @ 20:19
Comment from: graham [Member] ·
Short = 8 oz
Tall = 12 oz
Grande = 16 oz
Venti = doesn't matter because only starbucks uses it

The beauty of all this is that I can go anywhere in Seattle or to most any Starbucks on the planet and order a tall latte knowing what size latte I am receiving. Whenever I go somewhere with "small" or "large" lattes, how much am I getting? I have no idea! Medium? Fahgettaboutit!
Short, tall and grande quantify the size without making ordering the drink sound like a math problem.
It's like with gasoline... you don't ask for gas by the octane rating, you say regular, plus, or supreme/premium.

Don't make me start on the "Pitcher of Mocha."
Permalink 06/02/05 @ 00:30
Comment from: ben [Member] ·
Explain to me why this doesn't make more sense... Oh God I suck.

Small = 8 oz
Medium = 12 oz
Large = 16 oz
XL = 20 oz
XXL = 24 oz
Permalink 06/02/05 @ 04:09

input focus in OS X

posted by scott on 05.04.20 at 01:37, Catch-all, hci, 1 comment Permalink

Hey, does anyone (and by anyone, I mean Devin) know how to configure OS X so that a window that is partially underneath another window has the input focus?

Comment from: devin [Member] ·
Scott, to the best of my knowledge (and a quick search on google) there is no way to do this for all apps.

There is a third-party app, CodeTek Virtual Desktop which may do what you want, among many other things.

You can, however, get so-called "sloppy-focus" behavior in Terminal by executing:

defaults write com.apple.Terminal FocusFollowsMouse -string YES

at a Terminal prompt and restarting Terminal. Obviously, you can turn it off by replacing YES with NO. Note, this works even if the front-most app is not Terminal, but in this case you get "focus-follows-mouse" semantics instead.

You can also get this behaviour in the X11 window manager with:

defaults write com.apple.x11 wm_ffm true

(Note: I haven't verified this for X Windows).

It's worth noting that this doesn't play too well, UI-wise, with the Mac's notion of a single visible menu bar. For example, if you have Safari as your front-most app but have the mouse over a Terminal window and hit Apple-T, Terminal puts up its font panel. Because Safari's menu is visible, you would expect Apple-T to put up a new tab, but this is not the case.

Permalink 04/20/05 @ 16:11

"Now he's in DOS. Oh my God."

posted by scott on 05.04.20 at 01:23, Catch-all, hci, Leave a comment Permalink

Said by a friend upon seeing me type into the black-and-white window of Emacs (under OS X) while hanging out in a neighbor's apartment.