The British Isles Venn
Posted on June 9th, 2009 at 3:25 pm by necco

Since we’re posting Venn diagrams… this one came to mind. I’m aware that it may spark controversy, but how else can you represent it so clearly?

From the author: “The Republic of Ireland and the United Kingdom are the only two sovereign states in this image. They are shown in red. Ireland and Great Britain are both islands and are shown in green. England, Scotland, Wales and Northern Ireland are constituent countries of the United Kingdom and are shown in orange.”

Oh… and technically it’s really an Euler diagram, not a Venn diagram: a Venn diagram has to show every possible intersection of its sets, while an Euler diagram shows only the relationships that actually hold.

Hard N Phirm: WTF?
Posted on June 2nd, 2009 at 12:00 pm by Steve

You know something’s up when their web page flashes a logo like that.

It gets worse: they very carefully rip off the WGBH program tag and the “Zoom” look in the twisted video about Pi below:

All hail Wolfram Alpha!
Posted on May 20th, 2009 at 1:26 pm by Daniel

My dad just sent me a demo of the jaw-dropping “computational knowledge engine” Wolfram Alpha, which went live this past weekend. You put in a search term, which could be a mathematical formula, two cities, whatever, and it analyzes the term and its context and returns related statistics, plots a curve, gives you a map, shows mortality rates, whatever! Off the hook!

The design reminds me of a contextual search engine design Tufte suggested, which I remember reading somewhere… anybody remember?

Wolfram Alpha’s result for a moon-position query
Check out the demo; it’s a must-see.

Wolfram Alpha demo

Imagining the Tenth Dimension
Posted on August 14th, 2008 at 5:20 pm by dr.hoo



Imagining the Tenth Dimension: A new way of thinking about time and space, a book by Rob Bryanton. www.tenthdimension.com

OMG this is totally fun.
Posted on August 11th, 2008 at 12:41 pm by Mutt

My friend Jay Schuster just pointed me to a really fun graphical description language, which can produce beautiful images with remarkably small amounts of code.

I encourage you to check out the galleries at contextfreeart.org and try it yourself if you’re inclined.

To give you a taste of what it’s like, this snippet of code:

startshape SeedOfLife
background { b -1 }   // brightness -1: a black background

// One dotted circle at the origin, plus six more rotated 60
// degrees apart, each offset so it passes through the center.
rule SeedOfLife {
   DotCircle {}
   6 * {rotate 60} DotCircle { y 1 }
}

// Trace a circle of radius 1 with 180 small green dots,
// stepping 2 degrees per dot.
rule DotCircle {
   180 * {rotate 2} CIRCLE { y 1 s 0.025 hue 120 sat 1 brightness 1 }
}

… produces this image:

The Seed of Life, as rendered by Context Free Art

Daily Speculations & Bo Keely
Posted on February 13th, 2008 at 5:18 pm by necco

I’ve been wanting to share this blog with you all for a while. I’ve been reading it for the past four months or so and have found it to be continually interesting. It’s a blog run by Victor Niederhoffer, a semi-famous Wall Street trader who was written up in the New Yorker recently (which is how I found out about him). Wikipedia says this about him:

Victor Niederhoffer, a well-known hedge fund manager, champion squash player and statistician, studied statistics and economics at Harvard University (B.A. 1964) and the University of Chicago (Ph.D. 1969). He was a finance professor at the University of California, Berkeley (1967–1972).

He’s lost all his money (and his clients’ money) twice, but has recovered both times. People aren’t sure what to think of him. His blog primarily offers insights into how any natural phenomenon relates to the market, but you’ll find just about anything there.

http://www.dailyspeculations.com/

One of his best friends is this guy named Bo Keely, who has had a life so incredible that he almost seems mythical to me. Although he has the pedigree, he lives as the antithesis of a Wall Street trader (literally in a hole in the desert). Check out one of his posts about his life, and his blog:

http://www.dailyspeculations.com/wordpress/?p=2445

http://www.dailyspeculations.com/keeley/hobo_index.html

Is Noise Information?
Posted on January 14th, 2008 at 7:59 pm by necco

Claude E. Shannon, the founder of modern information theory, said that it was in his 1948 paper A Mathematical Theory of Communication. He stated that the information entropy of a signal equals -∑ p(x) log p(x), summed over the set X of possible messages, where p(x) is the probability that message x was received.

Some just think it’s noise because they don’t want to listen.
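
To make the formula concrete, here’s a minimal Python sketch (mine, not Shannon’s, and not from the original post) that estimates the entropy of a signal from the empirical frequencies of its messages:

from collections import Counter
from math import log2

def shannon_entropy(messages):
    # H = -sum over x of p(x) * log2(p(x)), written here as
    # p(x) * log2(1/p(x)) to keep the result non-negative.
    counts = Counter(messages)
    total = len(messages)
    return sum((n / total) * log2(total / n) for n in counts.values())

print(shannon_entropy("HTHTHHTT"))  # fair-looking coin flips: 1.0 bit per symbol
print(shannon_entropy("HHHHHHHH"))  # a constant signal: 0.0 bits, no surprise at all

A perfectly predictable signal carries zero bits, while a uniformly random one maximizes the entropy, which says something for the title’s question: by Shannon’s measure, pure noise is the most information-dense signal there is.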