Monday, December 29, 2008

Optical illusions

First, I advise the curious reader to listen to this talk by Al Seckel, and then take a look at the wonderful set maintained by Michael Bach. Read the fine print below the illusions! That's the most interesting part: the references to the research papers.

Now, when you're done with all that - I wanted to show something quite powerful that I've found on the interwebs: something that can transform your brain's perception of the very world around you.

What to do: follow the link, maximize the flash window to take up the entire screen, and stare at the center for a couple of minutes. Then, before you start actively blinking, look around and watch what happens to the world. So, open the flash here.

Sunday, December 28, 2008

Can party

I've been watching those suspicious empty cans for a while, and finally decided they've gathered enough interactions to be worth capturing.

party

The One

In an earlier post I wrote about earthlings, as a whole, more and more forming a single entity. Kevin Kelly has a different view on what I think is the same thing - how the whole world will become "The One".

A giant entity - call it a machine or an organism, because I believe the differences are slim, if any. An entity which will be a medium, a substance, in which new, higher-order forms of being will emerge - with its basic building block being the "temes", a term coined by Susan Blackmore.

Pretty fascinating.

Particularly interesting is the predicted "total transparency" - something that is happening at an accelerating pace, and probably going to happen regardless of whether we like it or not.

On the other hand, when this "total transparency" is transposed from individuals onto societal structures, I think it will cause a cardinal shift in the way we are. Of course there will be glitches along the way - but I think this links very well with another idea of mine: that we eventually reach a state where everything about everyone is accessible to anyone else, and all those accesses are themselves registered as events.

When you are in a crowd and someone looks at you - you notice. The same will happen, but without any spatial boundaries. An explosion of information.

Subconscious vision

A very interesting article. One can "see" without consciously realising it: a person without visual lobes was able to navigate a path full of obstacles.

My supposition: when LSD allows one to "see" sounds, it's the visual cortex that is temporarily mis-wired.

Hmm... though here they say that brain scans on LSD did not reveal increased activity in the visual cortex... I wonder if there's any scientific literature on the topic.

Anyway, a step further: one consciously analyses the information one sees. Could one "see" by creating within oneself the illusion of seeing, at a higher level of abstraction, as soon as one genuinely believes in it? And how could one do that in a predictable manner?

Saturday, December 27, 2008

Parsing "real" C code in Ruby using CAST

CAST, a C parser written in Ruby, is a pretty nice piece of machinery. One drawback: it has no preprocessor of its own, and it chokes on the "#" line markers that CPP leaves behind. So, as a first step toward making it more fun for real-world use, I made it friendlier at parsing CPP-digested files.

The algorithm (if I may call it that) is dead simple: read the lines from the CPP-digested file and build a "compressed" version of the content, with empty lines and #-lines removed. Some gsub()-style replacement of gcc-isms is also needed before stuffing the lines into the compressed content. For each line of compressed content, record the original line number, and also track the pre-CPP filename and line number. When/if CAST barfs, catch the exception and parse the line number out of the message. Then look it up in the info generated earlier, and raise your own exception - this time supplying miscellaneous goodies like the pre-processed filename and its line number.

With the resulting wrapper parser it seems I can parse most of my personally written sources. The cool part is that it catches the places where I have "stretched" gcc's kindness a bit - CAST's parser is stricter than GCC. I'm not sure whether it is "pure C99", but it's a good thing to have something like this.

The only tiny gotcha I got caught by while coding this little 80-line piece: line number == array index + 1. I named one variable "linenumber" whereas it was really counting the array index - and so shot myself in the foot a bit later. Sometimes I wonder why we don't all just start counting from zero, really.
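In code, the fix was simply to keep the conversion in one obvious place - a toy sketch (the helper name is mine, not from the actual script):

```ruby
# A zero-based array index maps to a one-based line number.
# Naming a variable "linenumber" while it actually holds the
# array index is exactly how I shot myself in the foot.
def lineno_of(idx)
  idx + 1
end

lines = ["int a;", "int b;", "int c;"]
lines.each_with_index do |line, idx|
  puts "#{lineno_of(idx)}: #{line}"   # prints "1: int a;" etc.
end
```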

Update: here's the Ruby hack that keeps CAST from barfing on the preprocessed file. Again, "real_code.c" is the file *after* cpp processing. One could indeed write their own preprocessor, but I just reused cpp.


require 'rubygems'
require 'cast'

#
# "prepare" a line - strip any of the gcc internal stuff from it,
# and any other things we do not understand.
#
def prepare_line(str)
  line = str.strip
  line = line.gsub(/\(\(__.*\)\)/, "").gsub(/__(attribute|extension)__/, "")
  line = line.gsub(/__const /, "const ")
  line = line.gsub(/__restrict /, "")
  line = line.gsub(/__asm__\s*\(.*\)/, "")
  line
end

#
# "prepare" the preprocessed output in array "lines",
# taken from file fname (filename used only for error messages)
#
def prepare_lines(fname, lines)
  # blob that will be parseable by CAST
  real_text = Array.new
  # info about the lines in the blob
  real_info = Array.new
  original_file = fname
  original_lineno = 1
  lines.each_with_index do |line, idx|
    real_line = prepare_line(line)

    if real_line =~ /^#\s+(\d+)\s+"([^"]+)"((?:\s+.*)?)$/
      num = $1.to_i
      fn = $2
      misc = $3
      original_file = fn
      original_lineno = num # minus something ?
      # puts "== set #{original_file}:#{original_lineno}"
      next
    elsif real_line =~ /^#/
      raise real_line
    end
    unless real_line =~ /^$/
      real_text << real_line
      real_info << { :file => original_file, :lineno => original_lineno,
                     :debug_file => fname, :debug_lineno => idx + 1 }
      # puts "#{original_file}:#{original_lineno}:#{real_line}"
    end

    original_lineno += 1
  end
  [ real_text, real_info ]
end

def parse_c(fname)
  lines = File.open(fname).read.split("\n")

  text, info = prepare_lines(fname, lines)
  blobtext = text.join("\n")

  parser = C::Parser.new
  parser.type_names << "__builtin_va_list"
  parser.type_names << "double"
  tree = nil

  begin
    print "Parse start...\n"
    tree = parser.parse(blobtext)
    print "Parse end...\n"
  rescue Exception => e
    puts "Got exception: #{e.inspect}"
    if e.message =~ /^(\d+):(.*)$/
      errline = $1.to_i
      errmsg = $2
      ei = info[errline - 1]
      src = text[errline - 2 .. errline + 1].join("\n")
      # puts "source lines: #{src}"

      raise "Error in #{ei[:file]}:#{ei[:lineno]} (#{ei[:debug_file]}:#{ei[:debug_lineno]}) : #{errmsg}"
    else
      raise "unparseable error message from parser"
    end
  end

  tree
end

tree = parse_c("real_code.c")

# puts tree.to_s
p tree

Thursday, December 25, 2008

Society as a neural meta-network

Today I was thinking about parallels between a spiking neural network and society as a whole - interestingly, they look quite similar.

Take a single neuron: it receives inputs from a lot of other neurons, with various propagation delays and connection strengths. As soon as the neuron is nudged hard enough, it spikes, and in turn propagates the impulse to the neurons connected to it.
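A toy sketch of that mechanism (all names, thresholds and weights below are made up for illustration, and there's no leak or refractory period - just "nudge, fire, propagate"):

```ruby
# Minimal integrate-and-fire neuron: accumulate input until the
# threshold is crossed, then reset and propagate to downstream neurons.
class Neuron
  attr_accessor :outputs   # downstream neuron => connection strength

  def initialize(name, threshold)
    @name, @threshold = name, threshold
    @potential = 0.0
    @outputs = {}
  end

  def receive(strength)
    @potential += strength
    return if @potential < @threshold
    @potential = 0.0                       # fired: reset
    $spike_log << @name                    # record the spike
    @outputs.each { |n, w| n.receive(w) }  # propagate the impulse
  end
end

$spike_log = []
a = Neuron.new("a", 1.0)
b = Neuron.new("b", 0.5)
a.outputs[b] = 0.7

2.times { a.receive(0.6) }   # the second nudge pushes "a" over threshold
p $spike_log                 # => ["a", "b"]
```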

Now look at humans, taking the simple case of no TV or other mass media. Everyone "connects" to their set of friends and acquaintances, and periodically takes information from them. Again, depending on the "nudge", a person might propagate the "impulse" - in this case a piece of information - to their other connections.

The difference between the models is the number of connections: in the brain a typical neuron has something like 10,000 connections, whereas a typical first-order "human network" is more like 100-500 people (if we take the "friends" counts on Facebook and other social networks as a rough estimate).

Now, putting the Internet into the picture: you can have 10,000 inputs pretty easily - you just need to be able to process them. Also, you no longer need to call or meet people to exchange information, which drastically increases the speed of the exchange.

Now comes a funny question: assuming these two models are somewhat similar, can the meta-network exhibit self-consciousness similar to that of its lower-order components - and if yes, how would it manifest itself? And, being parts of the whole, can individual humans recognize this, or is that fundamentally impossible?

Within this line of thought, it is also interesting to consider political systems as "rules of operation" for this kind of meta-network. Then it is pretty clear why democracy is a reasonably robust system: it attempts to maintain reasonable levels of connectivity within the network. I've long been saying that a dictatorship is the most efficient form of government for as long as you manage to have a benevolent dictator - but its robustness is the worst. The efficiency is obviously the highest, because the whole network is effectively a giant amplifier for a single member. But there is absolutely no feedback mechanism, so when the key member of the network destroys itself over time or "spikes randomly", there is nothing to prevent the negative effects.

With democracy, on the other hand, a lot of "individual instabilities" are taken care of by feedback mechanisms. The implication is that "collective thinking" relies on a lot of low-bandwidth communication, which caps its efficiency.

Is this just a kind of time-space tradeoff, or could there be a way to get the benefits of both?

conkeror: Copy the current URL to clipboard

As I discover the most frequently used features I miss, one of them is apparently copying the current URL to the clipboard, for subsequent inclusion into web 0.2 tools - like email.

This piece adds the missing functionality. Initially I thought of coupling it with editing - something like "edit the current URL and open it in a new window" - but that seems like "nice to have" functionality I'd never use anyway. So, no dead code!


interactive("copy-current-url-to-clipboard",
            "Copy the current URL to clipboard",
            function(I) {
                var a_url = I.buffer.display_URI_string;
                writeToClipboard(a_url);
                I.minibuffer.message("URL " + a_url + " copied!");
            });

define_key(content_buffer_normal_keymap, "C-c",
           "copy-current-url-to-clipboard", $category = "Movement");

Editor for the files on IOS

This is très cool:

ED-like editor for the files on IOS

Any takers to make vi or emacs emulations? :-)

Conkeror: the browser for those who dislike the double click

...and the single click too, for that matter - you can get by pretty much *only* with the keyboard.

I've stumbled across this post from the renowned _why, and decided to try it.

On Ubuntu 8.10, the steps to try it out were:

apt-get install xulrunner-1.9-gnome-support

(NB: I think xulrunner by itself might be fine as well, but "gnome support" seemed like a good thing to have. Maybe the Xmas mood is at fault.)

git clone git://repo.or.cz/conkeror.git
cd conkeror
make
xulrunner application.ini

The ability to use only the keyboard to meaningfully operate the browser is pretty refreshing.

The only tough part is that I use vi, as opposed to emacs, so the keyboard bindings are a bit unusual. Maybe it will help me get the emacs keys into motor memory - something I've tried a few times in the past without much success. After 18 years of using vi, one can imagine that's tough - I've always fallen back to where I'm more productive without letting the memory settle.

Update: nope, looks like the motor memory has persisted, and I actually ended up customizing the keybindings rather than reprogramming the low-level memory in my brain...

Adding the following to ~/.conkerorrc got me going for now.

define_key(content_buffer_normal_keymap, "C-l",
           "find-url-new-buffer", $category = "Movement");
define_key(content_buffer_normal_keymap, "<",
           "go-back", $category = "Movement");
define_key(content_buffer_normal_keymap, ">",
           "go-forward", $category = "Movement");
define_variable("editor_shell_command", "gvim -f");


Saturday, December 20, 2008

sup - a console, ruby-based mail reader with search

Last night I stumbled across sup. I've been a long-time user of pine. (Yes, I know about the politics behind the "free or not" license, and I did try to migrate to mutt - alas, mutt was too slow for me at managing a lot of folders over IMAP.)

But something like gmail for my mail... awesome. I grabbed the git version, and below are my impressions from first messing with it over the weekend.

1) don't use sup-config to add the sources.

Use sup-add directly. Though you can use sup-config to generate the command lines, in case you can't read the --help output of sup-add.

2) run sup-update before the first use!

I tend to store a fair amount of mail - the total size of mail accessible at any one time is about 1-2 gigabytes. I tried to let sup itself index it - a terribly bad idea! Thread addition seems to get slower and slower as the number of threads grows, so after about 800 threads I gave up. There were also a couple of odd, unreproducible crashes when I tried to navigate the list of mails while it was being indexed.

3) the ssh+mbox code seems to be slightly out of date - at least it did not work for me.

It's not a big deal to patch - pretty much all the changes are in unsafe_connect and do_remote. I hacked things up rather uncleanly, so no diffs. But if you are a beta user, this should be trivial for you :-)

4) password storage in the config file

This is totally not awesome, but easily fixable - e.g. by interfacing with one of the pinentry executables.
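For instance, one could shell out to a pinentry program and speak its Assuan-style protocol. A rough sketch - I have not wired this into sup, the helper name is mine, and it assumes a `pinentry` binary on $PATH:

```ruby
# Ask pinentry for a password instead of keeping it in the config file.
# Assumes pinentry's usual protocol: an "OK" greeting, "OK" after each
# command, and a "D <data>" line carrying the entered pin.
def ask_password(prompt)
  IO.popen("pinentry", "r+") do |io|
    io.gets                        # greeting: "OK Pleased to meet you"
    io.puts "SETPROMPT #{prompt}"
    io.gets                        # "OK"
    io.puts "GETPIN"
    while line = io.gets
      return line[2..-1].strip if line =~ /^D /   # the pin itself
      break if line =~ /^(OK|ERR)/                # done, or cancelled
    end
  end
  nil
end
```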

5) the zillion folders I had before are not so convenient to poll.

Under the presumption that I'm going to use the gmail-like approach, I've simplified the procmail filtering down to a "to-me" folder (which goes into the inbox), "critical mails" (which also go into the inbox, under another label), and a "bulk" folder (which gets archived immediately and labeled "bulk").
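The simplified ~/.procmailrc boils down to something like this (a sketch - the addresses are placeholders, not my actual setup; the trailing slashes mean maildir delivery):

```
# mail addressed directly to me -> "to-me" (ends up in the inbox)
:0
* ^(To|Cc):.*me@example\.com
to-me/

# "critical" senders -> also inbox, under their own label
:0
* ^From:.*(alerts|pager)@example\.com
critical/

# everything else -> bulk, archived immediately
:0
bulk/
```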

Overall the search looks reasonably fast - although I only have about 20,000 messages, which is not a big amount. We will see how it behaves in a year :-) - and, most importantly, during the working week.

----
Update: I found that a few of the mails - seemingly the bigger ones - were dropped by some destination system. Interesting. I am curious why.

Update2: well, at least one of them was me fat-fingering in the middle of the night. The other one is still unknown, but when I resent it today, it worked. Odd.