the Internet was borked today

What a day — Rogers’ network was b0rked starting around noon, which meant that our office T1 connection flapped up and down all afternoon. It finally seemed to give out entirely around 4 p.m., at which point I think everyone at the office just headed home. Since I have my name on a ticket with Rogers, I’ll definitely be asking for a refund per our SLA.

Matt, a former colleague of mine, told me that Bell had been advertising null routes in the morning too. All in all, not a good day for Canadian ISPs.

In other news, I’m working on the Asterisk VoIP server again. As I build and configure this thing, I think it’s best to keep a running list of questions to answer as I progress. Here’s what I have so far; if you are an Asterisk guru and happen to know the answers to any of these, feel free to comment on this post!

  1. Actions in the dialplan are usually applications. Where does one find a list of these applications (and what they do), short of doing a show applications at the Asterisk CLI?
  2. What is the syntax of each exten => line in the dialplan? (I know I’ll probably come across this in the docs eventually; my current guess at the general format is sketched just after this list.)
  3. How do I register phones (either softphones or physical IP phones) with Asterisk, and how do I get them to authenticate to the Asterisk server? Is there any way to auto-register phones with a restricted dialplan?
  4. How do I know what all the sounds bundled with Asterisk say, short of playing them one-by-one?
  5. How do I configure the voicemail system to do things like e-mail voice messages to the mailbox owner, encode them in different formats such as Ogg Vorbis or MP3, and so on?
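To make question 2 concrete, here is my current guess at the general shape of a dialplan entry, pieced together from the demo configuration. The context name and extension number below are made up, and I may well have details wrong:

    ; extensions.conf (sketch): exten => <extension>,<priority>,<Application>(<arguments>)
    [internal-test]                          ; hypothetical context name
    exten => 100,1,Answer()                  ; pick up the call at priority 1
    exten => 100,2,Playback(demo-congrats)   ; play one of the bundled sound files
    exten => 100,3,Voicemail(u100)           ; send the caller to mailbox 100's "unavailable" greeting
    exten => 100,4,Hangup()                  ; tear the call down

If that is roughly right, then most of question 2 comes down to learning the pattern-matching syntax for extensions and whatever special priorities exist.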

I plan to post the answers in this journal as I find them.

1986 was not a good year

Yesterday was the twentieth anniversary of the Chernobyl nuclear power plant disaster. I have to wonder: to what extent did the usability, or lack thereof, of the control room instruments contribute to the disaster? I mean, just imagine trying to make sense of a reactor’s status from this panel indicating fuel rod positions. (The complete gallery has more interesting pictures)

Wikipedia’s article on the disaster has this telling quote:

The unstable state of the reactor was not reflected in any way on the control panel, and it does not appear that anyone in the reactor crew was fully aware of danger.

Still, even if the “unstable state” had been reflected somewhere on that massive control console, would anyone have been able to find it in the haystack of instruments?

The Three Mile Island accident is another example of how control console complexity can exacerbate an emergency. In that accident, too, there was no reliable instrument to indicate the status of the critical components that had failed. Another telling quote:

There is general consensus that the accident was exacerbated by incorrect decisions made because the operators were overwhelmed with information, much of it irrelevant, misleading, or incorrect.

I’m not convinced that control room “habitability” has improved much since these accidents occurred. Just have a look at this section of the newly refurbished Pickering A Unit 1’s control room and you’ll see that the instrument panels are just as complex as ever. I worry: if another serious incident were to occur in a nuclear reactor today, would the operators be able to interpret their instrument panels correctly in time to prevent an accident?

I leave you with this sobering view of reactor #4’s control room as it sits today, courtesy of Robert Polidori.

Control Room of Unit 4 reactor after the meltdown.

replacing a failed Sun LVM mirror

The problem with mirroring your disks is that one side of the mirror will invariably fail a couple of weeks after you set it up. This has happened to me several times: first under NetBSD (with its excellent RAIDframe technology, functionally a worthy competitor to Sun Volume Manager), and now with the Sun LVM mirror that I set up several weeks ago and documented in this very blog.

I called Sun support, and they shipped me a new disk. Here’s how I went about replacing the failed device, without incurring any downtime (yay, Sun hot-swappable parts)! Continue reading
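For the impatient, the general shape of a hot-swap replacement under Sun Volume Manager looks something like the sketch below. The metadevice and disk names are hypothetical (your controller, target, and slice numbers will differ), and the full writeup goes through the real steps and output:

    # see which submirror components need maintenance
    metastat
    # delete any state database replicas that live on the failed disk
    metadb -d c1t1d0s7
    # unconfigure the failed disk, physically swap it, then configure the new one
    cfgadm -c unconfigure c1::dsk/c1t1d0
    cfgadm -c configure c1::dsk/c1t1d0
    # copy the partition table over from the surviving disk
    prtvtoc /dev/rdsk/c1t0d0s2 | fmthard -s - /dev/rdsk/c1t1d0s2
    # recreate the state database replicas on the new disk
    metadb -a -c 3 c1t1d0s7
    # re-enable the failed components; the submirrors resync in the background
    metareplace -e d10 c1t1d0s0
    metareplace -e d20 c1t1d0s1

Once metastat reports the resync complete, the mirror is whole again.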

rebuilding Asterisk from scratch

As I mentioned in a previous post, I realized that how much I would learn about Asterisk was limited by the amount of hand-holding that Asterisk@Home provides. Don’t get me wrong: A@H is a great way to get started with Asterisk, as it comes with a huge variety of features already built in. But for someone who is happier hacking about and getting to know every nook and cranny of his VoIP system, starting over from scratch was the only way.

I was also eager to rebuild the VoIP server using a BSD. I find Linux just too bloated for use as a VoIP server, and I wanted to see how far NetBSD has come since I last used it; my last NetBSD machine ran 1.6, and the project is onto 3.0 by now. I’m very familiar with the progress made by the FreeBSD Project, and I’m curious to see how NetBSD stacks up. Continue reading

TheDailyWTF.com on AJAX and Web 2.0

If you work in IT and you don’t already read The Daily WTF, you should. The site bills itself as documenting “curious perversions in IT”, and I have to say that this is an understatement; the code that frequently shows up there is bad enough that the word “poor” does not begin to describe it. Sadly, I think much of the code behind so-called enterprise-grade software out there in the world is of the same calibre. We should be afraid.

I had to laugh at their recent skewering of AJAX and Web 2.0. I’ve complained about such things before, but this one does it in such a brilliant way that I really don’t have to say much more:

The introduction of the XMLHttpRequest component opened the doorway for a new breed of “fancy schmancy” web applications like Flickr, GMail, etc. This, in turn, spawned an entire sub-industry and a new series of buzzwords seemingly based on the names of household cleaning chemicals. It even incremented the current version of the Internet to 2.0.

Although the Web is apparently now at version 2.0, much of the software continues to be in beta.

the risks of “outsourcing to the web”

It seems that within the last few months, much has been made of so-called Web 2.0 sites. The fact that nobody can even settle on a noun for “Web 2.0” (is it a paradigm? a metaphor? a meme? ugh) should be enough to convince you that it is just the latest buzzword for webpage design and innovation, but I digress. My objective today isn’t to complain about the use of the term Web 2.0, but to talk about one alarming aspect of these new sites: the outsourcing of your private data storage to commercial entities.

Many of the new, highly interactive web properties like Flickr, Gmail, Friendster, and the like use sophisticated technology (at least in the context of the Internet) to make their sites operate much like thick clients, i.e. traditional software running on your desktop computer. One of the primary technologies in use, of course, is Asynchronous JavaScript and XML (AJAX), which lets a page exchange data with the server in the background so that it can respond without full page reloads, creating the illusion that you are using a desktop application. This has made it possible to create web-based lookalikes of traditional desktop applications, such as e-mail (e.g. Gmail), bookmark managers (e.g. del.icio.us), CRM applications (e.g. Salesforce), and even office applications like a word processor (e.g. Writely). A number of factors have made these sites particularly attractive to the end user:

  • ease of use
  • no need to worry about data storage on one’s local (volatile) storage device
  • no need to maintain one’s locally installed software, apply security fixes and bugfixes, etc.
  • ability to quickly “upgrade” to new vendor-released versions since the application is centrally-managed

We can expect the adoption rate of these applications to increase both as more users discover their utility, and as more such applications are created.

In the move from the desktop to the web (the so-called “outsourcing to the web”), issues such as privacy and data retention are frequently glossed over, or simply not recognized by end users as important. It is difficult for many users to understand even one site’s privacy policy, never mind five or ten. The perplexing question of “How is the data from my personal documents, such as e-mails, letters, and word-processing files, being used?” may not be adequately answered even by a privacy policy, because privacy policies often cover only the stored information itself, not any derivative works. By derivative works, I mean that statistical data about your e-mail or Writely documents might be used to target ads to you, or that aggregate word-frequency statistics across all Writely users might be shared or sold to marketers for data mining. As one marketing guru said to me recently, the possibilities for data mining are endless (his exact words were “we data mine the hell out of things!”).

Another big concern is that many of the applications currently being created revolve around Google in some way. Not only has Google been the primary developer of many rich web applications, with products such as Google Maps, Google Desktop, Google Page Creator, Writely, and, of course, Gmail, but many other developers have taken advantage of Google’s open APIs to create their own derivative applications (such as Frappr). What happens when Google decides to use the stored user data in new ways? Or what if Google, formerly seen as the benevolent hacker’s workshop, changes its tune and becomes more corporate and controlling, like Microsoft? The concentration of power around one publicly traded corporation should be alarming to any consumer. (I could make the same arguments about Yahoo, given its attempt to compete in the same space as Google, albeit by acquisition; their purchases of del.icio.us, upcoming.org, and so on are prime examples of this strategy.)

What is the solution? As I pointed out already, I expect the adoption of such rich applications to increase, not decrease, not only because of their technological merits but because they frequently build online communities that are appealing to users (and to marketers, of course). However, I think that any user who values his or her privacy, and who is uncomfortable with the notion of data mining of personal correspondence, would do well to keep managing that correspondence with traditional desktop software.

are you also evolvolving?


(The above is a hilarious typo on the website for VON Canada.)

This month’s Toronto Asterisk Users’ Group meeting was held at the Voice on the Net Canada 2006 conference. Given the audience (business users and implementers of largely commercial telecommunications equipment), Asterisk was probably a new concept to many attendees, so some of the presentations at TAUG were aimed at an entry level.

Still, there were some really cool Asterisk add-ons demonstrated. One such patch was the Asterisk Real-Time Voice Changer, which lets you alter the pitch of your voice in real time. It’s great fun for pretending to be a secret informant. Claude Patry, one of the developers of the patch, noted that if you have access to the Asterisk CLI, you can even do this to someone else’s call in progress; a very evil use, to be sure, but a great way to get back at the co-workers who piss you off.

Iotum demonstrated their “relevance engine”, which is basically a rules-based engine for determining the priority and subsequent routing of incoming voice calls. For example, if my girlfriend called me, I could get alerted over instant messenger, while lower-priority callers would get shunted to voicemail. Of course this is a trivial example; the rules taken into account could also be things like “do I have a meeting scheduled with this caller later in the day” or “am I expecting a call from such-and-such a person today”.

I’ll probably be replacing my Asterisk@Home system with a regular Asterisk installation sometime soon, so I can get a better idea of how everything is put together.

trying out Asterisk@Home

I’ve recently been getting into voice-over-IP telephony, both because of my day job (where I’m now responsible for managing a very expensive but full-featured Cisco VoIP system) and because of my long-time desire to build a hobbyist PBX at home using Asterisk. I’d set up Asterisk on a FreeBSD 5.4 server some months ago, but only got as far as installing a demo dialplan before I got distracted. This time around I decided to give Asterisk@Home a spin, because it bundles many common Asterisk add-ons and features into an easy-to-install ISO backed by CentOS 4.x. (For those who don’t know what CentOS is, it’s basically a straight recompile of Red Hat’s popular Enterprise Linux product, and as such is available for free.) Continue reading

Internet nostalgia

I’ve been “on the Internet” (a term which, by the way, makes no sense) for about twelve or thirteen years now, and although this makes me a young ‘un from the perspective of those folks who invented TCP/IP, I still remember enough of the days before the World Wide Web to have some nostalgia for the way the Internet used to be. I bring this up because I just came across a little notebook in which I used to write down relevant URIs and other ephemera. Some of the gems in there:

  • a listing of my favourite Archie servers
  • directions for accessing the e-mail anonymizer that used to live at anon@anon.penet.fi
  • my old FidoNet e-mail address (julian.dunn@f11.n241.z1.fidonet.org)
  • logon information for my various FreeNet accounts such as the now-defunct Cleveland FreeNet
  • a listing of Gopher servers upon which one would have found current price quotes for Macintoshes
  • various sundry BBS telephone numbers that I’m sure are all out of service by now
  • information on how to get to my Dad’s VAX account via Datapac (anyone know if Bell Canada is still operating Datapac these days?)

I was surprised not only to find that I’d written down information for accessing the Internet Oracle (a/k/a rec.humor.oracle), but also to discover that the Oracle is still going strong.

Anyone else have old Internet memories they’d like to contribute?