Thursday, November 30, 2006
Wednesday, November 29, 2006
Astronomers using the H.E.S.S. telescopes have discovered the first-ever modulated signal from space in Very High Energy Gamma Rays -- the most energetic such signal ever observed. Regular signals from space have been known since the 1960s, when the first radio pulsar (nicknamed Little Green Men-1 for its regular nature) was discovered. This is the first time a signal has been seen at such high energies -- 100,000 times higher than previously known -- and it is reported November 24th in the journal Astronomy and Astrophysics.
Unlike sniffer dogs, which require three months of training, the bees take only 10 minutes to train.
After training, three or four bees are put in a shoebox-sized "sniffer box", held in position on plastic mountings. Air is sucked into the box by a fan via plastic tubes and wafts gently over the bees.
If they detect explosives in the air, the trained bees all stick out their proboscises together.
A miniature video camera in the box is trained on them and is connected to a computer programmed with movement recognition software. As soon as the movement of the proboscises is detected, an alarm sounds to alert the security operator.
To avoid false alarms from rogue results, a single bee sticking out its tongue does not set the system off.
DomainKey-Status: good (test mode)
Received: from zps36.corp.google.com (zps36.corp.google.com
by smtp-out.google.com with ESMTP id kAT49pg2026142
; Tue, 28 Nov 2006 20:09:52 -0800
DomainKey-Signature: a=rsa-sha1; s=beta; d=google.com;
In the “extended header” it said:
Sent by: google.com
Signed by: google.com
The “test mode” designation is interesting, too.
Well, looking further at Wikipedia, Google has been using DomainKeys since 2005 and actually went live with them shortly before Yahoo (which developed the concept).
Cool, so here's how you look up the public key to verify (the above) message. The s=beta is the selector. You prepend that to the _domainkey subdomain, and that to the signing domain d=google.com.
> set query=any
beta._domainkey.google.com text = "g=\; k=rsa\; t=y\;
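The record that comes back is just a little tag=value list separated by semicolons (the backslashes above are nslookup's escaping). Here's a quick sketch of pulling it apart; parse_domainkey is my own helper name, not anything standard:

```python
# Sketch: split a DomainKey TXT record like the one above into its
# tag=value pairs (g, k, t, p, ...). parse_domainkey is a made-up
# helper, not part of any library.

def parse_domainkey(record: str) -> dict:
    """Turn 'g=; k=rsa; t=y' into {'g': '', 'k': 'rsa', 't': 'y'}."""
    tags = {}
    for field in record.split(";"):
        field = field.strip()
        if not field:
            continue
        name, _, value = field.partition("=")
        tags[name.strip()] = value.strip()
    return tags

# The record above, with the escaped semicolons unescaped:
print(parse_domainkey("g=; k=rsa; t=y"))
# {'g': '', 'k': 'rsa', 't': 'y'}
```

Note the t=y tag: that's what marks the key as "test mode", matching the DomainKey-Status header up top. The actual public key would be in a p= tag, which nslookup truncated here.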
Here's a nice slide show from Eric Allman.
Tuesday, November 28, 2006
Okay, Ze Frank's video blog is funny. I just watched the 27 November entry about Scrabble. Hah!
(But caution, there is some PG-17 language and maybe more).
Saturday, November 25, 2006
Actually the time period was early for meteors since the radiant had barely risen at that point, i.e., we had barely rotated around to the side of the earth that would encounter the meteoroids.
Saturday, November 18, 2006
Don't forget the Leonid meteor shower.
In North America, for the Maritime Provinces of Canada, New England, eastern New York and Bermuda, the Sickle of Leo (from where the Leonids appear to emanate) will be above the east-northeast horizon just as the shower is due to reach its peak. But because Leo will be at a much lower altitude compared to Europe, meteor rates correspondingly may be much lower as well. However, this very special circumstance could lead to the appearance of a few long-trailed Earth-grazing meteors, due to meteoroids that skim along a path nearly parallel to Earth's surface. Seeing even just one of these meteors tracing a long, majestic path across the sky could make a chilly night under the stars worthwhile.
Unfortunately, for the central and western United States and Canada, the Leonid outburst will likely have passed before Leo rises; at best, nothing more than the usual 10 or so Leonids per hour will likely be seen.
Keep in mind that for New England and the U.S. East Coast, the peak is due locally on the previous calendar day, Saturday, Nov. 18, at 11:45 p.m. Eastern Standard Time (For the Canadian Maritimes and Bermuda, the corresponding time is 12:45 a.m. on Sunday, the 19th. For Newfoundland it is also on the 19th, but at 1:15 a.m.).
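Those local times are all the same instant, just viewed from zones that run 1 hour and 1.5 hours ahead of Eastern. A quick sanity check with Python's zoneinfo (zone names are the standard IANA ones; this assumes the tz database is installed):

```python
from datetime import datetime
from zoneinfo import ZoneInfo

# The peak instant as given for the U.S. East Coast:
# Saturday, Nov. 18, 2006, at 11:45 p.m. EST.
peak = datetime(2006, 11, 18, 23, 45, tzinfo=ZoneInfo("America/New_York"))

# The Maritimes run an hour ahead; Newfoundland an unusual 1.5 hours.
for zone in ("America/Halifax", "America/St_Johns"):
    local = peak.astimezone(ZoneInfo(zone))
    print(zone, local.strftime("%a, %b %d, %I:%M %p"))
# America/Halifax Sun, Nov 19, 12:45 AM
# America/St_Johns Sun, Nov 19, 01:15 AM
```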
Thursday, November 16, 2006
Actually, it *is* a Sun engineer joke. The 10 Mbps predecessor was a chip called the "Big MAC Ethernet" -- in this case, it was a mere pun, since MAC stands for Media Access Control, i.e. the Ethernet data-link layer. When they designed the 10/100Mbps chip later on, they decided to turn the pun into a full-out joke, and they called it the "Happy Meal Ethernet."
On a Solaris machine, the interfaces are named by the driver type, so instead of "eth0" you'll see something like "hme0".
Wednesday, November 15, 2006
Tuesday, November 14, 2006
A Chinese submarine stalked a U.S. aircraft carrier battle group in the Pacific last month and surfaced within firing range of its torpedoes and missiles before being detected, The Washington Times has learned.
The surprise encounter highlights China's continuing efforts to prepare for a future conflict with the U.S., despite Pentagon efforts to try to boost relations with Beijing's communist-ruled military.
Sunday, November 12, 2006
Your editor has recently had the opportunity to write a Linux driver for a camera device - the camera which will be packaged with the One Laptop Per Child system, in particular. This driver works with the internal kernel API designed for such purposes: the Video4Linux2 API. In the process of writing this code, your editor made the shocking discovery that, in fact, this API is not particularly well documented - though the user-space side is, instead, quite well documented indeed. In an attempt to remedy the situation somewhat, LWN will, over the coming months, publish a series of articles describing how to write drivers for the V4L2 interface.
V4L2 has a long history - the first gleam came into Bill Dirks's eye back around August of 1998. Development proceeded for years, and the V4L2 API was finally merged into the mainline in November, 2002, when 2.5.46 was released. To this day, however, quite a few Linux drivers do not support the newer API; the conversion process is an ongoing task. Meanwhile, the V4L2 API continues to evolve, with some major changes being made in 2.6.18. Applications which work with V4L2 remain relatively scarce.
V4L2 is designed to support a wide variety of devices, only some of which are truly "video" in nature:
- The video capture interface grabs video data from a tuner or camera device. For many, video capture will be the primary application for V4L2. Since your editor's experience is strongest in this area, this series will tend to emphasize the capture API, but there is more to V4L2 than that.
- The video output interface allows applications to drive peripherals which can provide video images - perhaps in the form of a television signal - outside of the computer.
- A variant of the capture interface can be found in the video overlay interface, whose job is to facilitate the direct display of video data from a capture device. Video data moves directly from the capture device to the display, without passing through the system's CPU.
- The VBI interfaces provide access to data transmitted during the video blanking interval. There are two of them, the "raw" and "sliced" interfaces, which differ in the amount of processing of the VBI data performed in hardware.
- The radio interface provides access to audio streams from AM and FM tuner devices.
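From user space, you can see which of these interfaces a given device implements by asking it with the VIDIOC_QUERYCAP ioctl and reading the capability bitmask. Here's a rough sketch; the ioctl number and flag values are from <linux/videodev2.h> of this era, and /dev/video0 is an assumption about your system:

```python
# Sketch: ask a V4L2 device which interfaces it implements via
# VIDIOC_QUERYCAP. Constants come from <linux/videodev2.h>.
import fcntl
import struct

VIDIOC_QUERYCAP = 0x80685600  # _IOR('V', 0, struct v4l2_capability)

CAP_FLAGS = {
    0x00000001: "video capture",
    0x00000002: "video output",
    0x00000004: "video overlay",
    0x00000010: "raw VBI capture",
    0x00000040: "sliced VBI capture",
    0x00040000: "radio",
}

def decode_caps(capabilities: int) -> list:
    """Turn the capabilities bitmask into readable interface names."""
    return [name for bit, name in sorted(CAP_FLAGS.items())
            if capabilities & bit]

def query_device(path="/dev/video0"):
    # struct v4l2_capability: driver[16], card[32], bus_info[32],
    # __u32 version, __u32 capabilities, __u32 reserved[4] = 104 bytes
    buf = bytearray(104)
    with open(path, "rb", buffering=0) as dev:
        fcntl.ioctl(dev, VIDIOC_QUERYCAP, buf)
    driver, card, bus, version, caps = struct.unpack_from("16s32s32sII", buf)
    return card.split(b"\0", 1)[0].decode(), decode_caps(caps)

# A capture-only camera (like the OLPC one) typically reports just:
print(decode_caps(0x00000001))  # ['video capture']
```

A real application would go on to negotiate a pixel format and start streaming, but the capability check is the standard first step.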
Wednesday, November 08, 2006
My experience is based on my Olympus D-40. (Not a DSLR).
Also note that I write this as a former SLR user of more than 25 years. I loved my Olympus OM-1 and miss it in many ways. (I still have it, I just don't use it any more). Still, I think the digital medium has advantages that don't require going back to the old SLR format.
My camera has all the controls mentioned: aperture, shutter speed, in-camera sharpening, white balance, flash sync. I'm not sure it has a contrast setting. These all require using menus, so I bet a DSLR may make some of them easier, but I can do them.
I'm not sure that I buy that a larger sensor (CCD?) is necessarily better. It will inherently have more pixels, but I definitely have enough pixels already with my 3.8 Mpx. My smaller sensor means my lens and camera can be smaller. Larger might be better but I'm not convinced.
I'm not convinced of this one either. I would need to see an objective spec on what the noise actually is. I have not experienced noise as a problem except in very low light situations. My camera does have a low noise setting.
I concede this point. I really wish my camera at least had a way to attach, or at least synchronize, a flash. (I know you can sync with a slave of some type). I don't really feel the need to have lenses outside my zoom range (which is roughly equivalent to 35–100mm on a 35mm camera). A longer zoom range on a camera would be nice.
No shutter lag
This is huge and my greatest complaint about my camera. Newer non-DSLR cameras solve this problem, though. A lot of it, I believe, has to do with the time it takes to unload the CCD and write to the memory device. So I don't think this is an advantage exclusive to DSLRs, just to newer cameras.
Maybe. This isn't that important to me. My camera starts up in three or four seconds. That's probably equivalent to taking a D/SLR out of a bag and removing the lens cap. (My camera fits in my pocket).
Higher Build Quality
Maybe. My camera does have a lot of plastic. It probably wouldn't withstand some abuse a more professional camera would. Still, it's pretty sturdy.
My viewfinder zooms to match the lens. I grant that it's not perfect. The LCD provides a perfect match, and I think the LCD is very useful as a viewfinder (though I typically don't use it unless I need to). A D/SLR viewfinder is a beautiful thing, but mine works pretty well.
You need to learn my technique for holding a small digital camera. Thumbs up with your left hand. This creates a very stable platform with your left hand to support the camera. It also keeps your fingers from blocking the lens. Control the shutter, etc., with your right hand.
No way. DSLRs are cheaper than DSLRs used to be, but still more expensive than the other digital cameras.
Tuesday, November 07, 2006
Monday, November 06, 2006
Thursday, November 02, 2006
With LEDs becoming absurdly bright, with high-resolution display technology becoming absurdly small and cheap, and with more powerful devices in our pockets, I expect that in the near future the device in our pocket will become a quite powerful video projector.
Imagine that you take out your pocket device (assuming you ever actually put it away), check your email and voice mail, make a call, and then it's time for the meeting to start. You tap a couple of spots on its interface, set it down on the table, and a large, bright display is projected on the wall. It's corrected for the geometry of the projection and the wall so there's no distortion. You don't have to worry about all the tilting, propping, focusing, and repeatedly pressing a keystone button.
You advance your slides, control your movie, or perhaps a live video feed from another conference room, with either someone else's hand-held device or a remote of some kind, IR or Bluetooth. Or with voice commands.
BTW, I think this is on the right track. From using Google apps (the word processor and spreadsheets) I can see this. Your world is on the web and is accessible from anywhere. Folks that say “Bah!” to the idea of a Web OS are missing the point. Yes, it's true the Web OS is not about interfaces to hardware and interrupt handlers. But this is about what's on the screen in front of your face. What's on the screen moves from being housed in the box on or under your desk to the cloud that's the web.
That's initially a fearful thought, but consider that the box on your desk is already part of that cloud, so it's not as much of a shift as you might think. With easy encryption, your data can be just as protected as it is (and should be) on your desktop computer.
With Google and Microsoft rapidly converging on this idea of a web-based OS, the idea of network computing and the network as the system is about to happen with a vengeance that exceeds the wildest dreams of Sun Microsystems.
With Google apps I've already experienced the one-click sharing of stuff and it's very cool, along with the capability of collaboration.
Whither the desktop computer? I think the desktop computer may soon be on its way out. Maybe not the screen and keyboard. The screens will probably get bigger (wider and taller) and flatter (thinner). I guess they'll ultimately just be “paint on the wall.” But the thing the screen and keyboard talk to will become portable. We'll carry them in our pockets and they'll connect to the screen and keyboard when we need them to. They'll handle phone calls, news articles, Slashdot, blogging, picture taking, music and movies, etc. They'll also project our images (movies, whatever) on any reasonably white wall or surface we can find—large screen anytime!
Based on this article, if they pull off Parakey, it will be open-source Parakey vs. cool Google vs. the dying Microsoft (trying to hang on as the world changes underneath it) in this so-called Web OS world. Microsoft will be too old and unable to change quickly enough to keep up. Microsoft will hang on to its entrenched corporate user base for a while as Google and Parakey invade from the bottom up the way Linux and Firefox have.
In that future, I imagine that (sadly) Google will become the giant, evil corporate entity that we love and hate (“No one ever lost their job by choosing Google”), and someone, perhaps Parakey, will be the cool new open-source solution that competes.