Ironically, the same week my InfoWorld subscription lapsed in the renewal process, Jon Udell, in his latest column, makes some of the very same points I made yesterday about the bastardization of email for file sharing.
“It drives me nuts when people send me multi-megabyte files as e-mail attachments. Don’t they know a better way?”
“E-mail is a poor file-transfer solution in many ways, but it makes perfect sense to users. An e-mail with an attachment compresses notification and delivery into a single step.”
Of course, Jon is much more eloquent in his rant than I was, and he follows it up on his blog with some interesting responses.
It is increasingly vexing to me that email has become the de facto standard for sharing files. The reality is that email was never designed for the job.
It is an all-too-common occurrence in corporate settings: multi-megabyte PowerPoint and Excel files get slammed out to numerous recipients on a distribution list, disparate versions shoot back at the sender, and there is no easy way to consolidate the flow.
Granted, document management, collaboration tools, portals, version control systems, and even peer-to-peer have all produced countless solutions to this problem, but the reality is that they usually require more effort than simply slapping that bloated file into an email and kicking it out to everyone and their mother.
Ugh! The madness!
Some day there will be a ubiquitous solution that is agnostic to OS and hardware platform. I suspect that, given the killer-app-ness of email, it will be something that seamlessly grafts itself onto that process.
Of course there are some very simple solutions like YouSendIt, which essentially saves your attachment to its web server and then pops out an email with a link to everyone on your distribution list. However, as simple as it is to use, it is still an out-of-context process.
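The pattern is simple enough to sketch. Here's a minimal, hypothetical illustration of the upload-then-notify flow; the host name, endpoint, and addresses are placeholders, not YouSendIt's actual API.

```python
# Sketch of the "park the file on a server, e-mail a link" pattern
# used by services like YouSendIt. The file host and addresses are
# hypothetical placeholders, not any real service's API.
import os
import smtplib
from email.mime.text import MIMEText

UPLOAD_BASE = "https://files.example.com/shared"  # hypothetical file host


def build_link(path):
    """Return the URL recipients click instead of receiving the bytes."""
    return f"{UPLOAD_BASE}/{os.path.basename(path)}"


def notify(recipients, path, smtp_host="localhost"):
    """Send a small text e-mail pointing at the uploaded file."""
    msg = MIMEText(f"A file has been shared with you:\n{build_link(path)}")
    msg["Subject"] = f"File available: {os.path.basename(path)}"
    msg["From"] = "sender@example.com"
    msg["To"] = ", ".join(recipients)
    with smtplib.SMTP(smtp_host) as server:
        server.send_message(msg)
```

The point is that recipients get a few hundred bytes of notification instead of a multi-megabyte payload, and the file lives in exactly one place.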
Perhaps next-gen mail servers with intelligent content distribution, version tracking, update/edit reconciliation, and offline synchronization are the way to go. I'm sure solutions like that are out there or in the works, but they are far from being everywhere like good ol’ SMTP.
Someday the madness will end.
As some of my friends can attest, one of my long-standing gripes about Flash has been its inability to be indexed by search engines.
I suppose that argument is now moot since I just read that Google is now indexing Flash files (via Outer-Court and The Unofficial Google Weblog).
I still have problems with Flash at times. I suppose my gripe now is more the misuse of Flash, when it adds no extra value over what a simple text description or graphic would accomplish.
However, at least I can now “google” for my favorite Flash.
This doesn’t seem like a spicy chicken dish to me…
“IBM is set to unveil an upgraded version of its enterprise-level search technology. Code-named “Masala,” the new software is an improvement on Big Blue’s DB2 Information Integrator released last year. It is expected to enable simultaneous search of the Web, internal applications and corporate databases … and will be released in beta in early May. The full release is slated for the third or fourth quarter.”
“By allowing corporate personnel to search a number of different content sources simultaneously, Masala could be effective in many different scenarios. Sales representatives, for example, could use it to learn about prospective clients by searching internal enterprise resource planning (ERP) systems, as well as information available on the Net.” (Via NewsFactor)
I wonder if Masala is related to WebFountain? Ahh! So it is…
“How is Masala search related to WebFountain?
Masala and WebFountain share technologies but serve different needs. WebFountain is a hosted solution focused on advanced analytics for the internet, while Masala provides search and analytics capabilities for enterprise content.” [more here]
New York Times article on the technological advances going into Lance Armstrong’s equipment for this year’s Tour de France:
“This is a mathematical model,” he said, noting that other factors affect performance. “A rider could have a bad breakfast.” (Link via Lockergnome’s Tech News Watch)
MSDN posted this week a series of InfoPath 2003 SP-1 Training Exercises for the recent preview of InfoPath 2003 Service Pack 1.
I haven’t had time yet to run through the exercises, but they seem to be a good primer for anyone interested in utilizing InfoPath. (Thanks for the link Martin!)
MSFT made a preview of OneNote 2003 SP1 available yesterday.
In addition to the ability to “Record video notes,” there are a number of other niceties, such as inserting documents from other Office programs into OneNote and sharing OneNote sessions in real time.
IMHO, OneNote is a wonderful environment for aggregating and now sharing disparate information.
Yet, there’s still no mention of MSFT opening the file format.
Granted, they have never been known to be forthcoming in that regard, but given the diversity of what you can now pack into a OneNote file “archive,” it occurred to me that an ideal export option would be to represent a OneNote archive as a Semantic Web-friendly RDF structure.
Yeah, yeah, yeah, I know… Not another Semantic Web rant. It’ll never happen.
Well, at the very least, I would still like to see Wiki integration with OneNote. Then again, this is only a “preview”. So, perhaps there’s hope with the Office 2003 SP 1 release later this year.
After reading a bit about the Sentinix GNU/Linux distribution, I wasn’t entirely interested, because it’s described as a Linux distribution for network monitoring, intrusion detection, penetration testing, auditing, statistics/graphing, and anti-spam.
The anti-spam feature seemed to be a minor addition.
That is, until I read a NewsForge article about the current Sentinix release from November 2003.
Ignore the title of the article and scroll down to the middle of the page, where they mention how openMosix clustering enables it to act as a spam/virus-filtering supercomputer.
Specifically, check out these quotes from the article:
“As a sysadmin I have frequently seen the need to add more processing power as e-mail traffic increases. The e-mail server is suddenly overloaded and a solution is needed immediately. With the typical system design, this is never easy, it is always tedious and expensive, and it generally causes down time. So, you follow a period of poor system performance by one of system outage.”
“But SENTINIX is on openMosix. You add a new computer to the network, boot it from the SENTINIX CD, and a node adds itself to the Cluster. In seconds the load is being taken up by the new “temporary” machine and the old server is back to running as intended.”
bknox: “So, you are just using the built-in load leveling of openMosix with these standard e-mail filtering applications? And the results?”
michel: “That’s right. SpamAssassin and MailScanner are processing-intensive, use modest I/O, and the e-mail handling generates several forked processes. We thought that this would be a great fit for openMosix, and it is.”
bknox: “OK, I know the theory. Processes automatically move to the available resources. But, the proof is in the results. What kind of test results have you seen?”
michel: “My tests are not rigorous or scientific, but sending a huge number of e-mails to a dual-processor (SMP) SENTINIX node plus one additional openMosix node will generally lower the workload on the dual-processor system and also finish the last e-mail more quickly (20-25% faster with no tuning or special consideration given to the cluster). I will share the details.”
It’s been a while since the Sentinix distro has been updated, but the mailing list is fairly active — apparently with an upcoming release in the works!
Apparently, the hatch.org domain was erroneously marked as being “On Hold” last night.
So if you sent mail to me @ hatch.org in the last few days it probably bounced.
Neither I nor Register.com has been able to determine why this happened (the domain did not expire). The very cordial and responsive Register.com support person told me they have corrected the problem, but since the “On Hold” status has already propagated, it’ll be roughly 24-72 hours before the domain appears active to the net.
I wouldn’t be surprised if the evil, customer-hating NetSol had something to do with the outage, even though I transferred the hatch.org domain to Register.com from Network Solutions (VeriSign) nearly two years ago.
Of course, I have no empirical evidence of this; I’m simply a dissatisfied customer.
Congrats to Greg Reinacker! His newly released NewsGator Media Center Edition is a proof of concept for things to come, specifically the syndication of multimedia content via RSS enclosures.
However, the combination of RSS and PVRs is not new. For example, the MythNews module for MythTV, the open source homebrew PVR project, is actually an RSS aggregator, but MythNews does not support RSS enclosures (yet! [wishful thinking]).
Personally, I think what Greg is doing in NewsGator MCE along with what Andrew Grumet is doing with RSSTV will have broader social impacts on the distribution of digital content.
For instance, suppose Netflix’s planned video-on-demand service offered its subscribers an RSS feed with enclosures.
Hypothetically, I can imagine a “Top Rated” feed from Netflix, made up of movies that have been highly rated by other Netflix users. NewsGator users would simply subscribe to the feed and receive a fresh supply of choice content.
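The plumbing for this already exists in RSS 2.0: the enclosure element, which points an aggregator at a media file to fetch in the background. The item below is entirely hypothetical (Netflix publishes no such feed); the URLs, rating, and file size are made up for illustration.

```xml
<!-- Hypothetical item from an imagined Netflix "Top Rated" feed -->
<item>
  <title>Top Rated: The Sting (4.6 stars)</title>
  <link>http://movies.netflix.example.com/the-sting</link>
  <description>Members rated this caper classic 4.6 out of 5.</description>
  <enclosure url="http://media.netflix.example.com/the-sting.mpg"
             length="734003200"
             type="video/mpeg" />
</item>
```

An enclosure-aware aggregator downloads the media overnight, which is exactly the fetch-ahead behavior a PVR wants.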
Indeed, this starts to get into the territory Andrew is paving with RSSTV, which is why I think the synergies of multimedia content and syndication technologies are starting to produce some interesting results.
Soon, changing channels on your TV will be akin to flipping through syndication feeds in your aggregator. Perhaps we’re already there…
Among the many interesting quotes in the recent BusinessWeek Online article on Microsoft’s midlife crisis, I found the following one suggesting that one of the features scrapped from the initial release of Longhorn involves the updated file system (WinFS).
In particular, it appears that WinFS will not include the ability to index and search corporate file systems.
“Longhorn will now ship with a scaled-back version of the file system. The current plan, in practical terms, means people will be able to search their PCs for documents and information related to each other, but they won’t be able to reach into corporate servers for similar files.” (link via John Battelle)
Excluding this feature from the initial release will certainly give MSFT more development time, but I also think it may be a way of pushing the features into a separate product such as SharePoint Portal Server.
Whatever MSFT’s reasons, the end result is that this exclusion provides another opportunity for Enterprise Search companies (such as Google) to get entrenched in the corporate infrastructure long before Longhorn hits shelves.
Ah, back in my day, we surfed the net with rodents…
Via Wired News: “Back in 1992, when “yahoo” was something cowboys yelled and “ebay” was just pig Latin, the University of Minnesota developed a new way of looking at data on the Internet. Their protocol, called “gopher” after the UMN mascot, allowed archivists to present the mishmash of information in a standard format, and enabled readers to navigate documents on a world of servers using a simple visual interface.
For a while, it seemed as if gopher might open the Internet up to the nontechnical masses and usher in a new era of online communication. It very well might have, if the Web hadn’t come along and done it instead.
Mention gopher to a newcomer to the Web and you might get a blank stare. Mention it to an old-timer and you’re likely to see a nostalgic smile…”
Indeed. Before NCSA Mosaic hit FTP servers and made the WWW usable, Gopher clients/servers were all the rage.
One thing the article neglected to mention: Wired magazine had its own Gopher server back in the days before its web site.
After reading a bit about Paper Airplane, my first impression is that it sounds a bit like Groove, but it differs in that it’s integrated into the browser (Mozilla/Firefox currently) and built on top of the Java-based JXTA and P2P Sockets frameworks. I haven’t tried it yet, but it seems worth a look even in its early beta state.
“Paper Airplane is a Mozilla plugin that empowers people to easily create collaborative communities, known as Paper Airplane Groups, without setting up servers or spending money. It does this by integrating a web server into the browser itself, including tools to create collaborative online communities that are stored on the machine. Paper Airplane Groups are stored locally on a user’s machine. A peer-to-peer network is created between all of the Paper Airplane nodes that are running in order to resolve group names, reach normally unreachable peers due to firewalls or NAT devices, and to replicate content.”
At work, when I evangelize the benefits of InfoPath as a tool for structured data collection and distribution, I talk about how, IMHO, InfoPath will someday unlock all the black-box business intelligence stuffed into Excel, Word, PowerPoint, et al. I also mention that it’s primarily an end-user tool: developers aren’t required to implement any of those simple form-based workflow processes that deluge most corporations with endless forest-killing paper forms.
Invariably, I get a response asking whether this can be done in the browser, or whether users need InfoPath installed on their desktops to enter data into forms. I regrettably say, “For now, you need InfoPath installed, but I think that will change.”
Well, it has been roughly a year since I first started playing with and touting InfoPath’s virtues. Unfortunately, a ubiquitous InfoPath runtime is still not available.
Apparently, I’m not alone with this gripe either.
Today Jon Udell quotes an anonymous InfoPath user regarding this missing element:
“I believe a primary requirement of a forms application is to make it possible for the form to be completed by a wide audience of people from whom I wish to gather data. A key driver, at least in the world of my customers, is to be able to distribute the form widely to people who aren’t necessarily connected to the network and get them to fill it in and return it. I don’t want to authenticate these people in my network. They won’t install software on their computers just to fill out my form. They don’t want to learn a new application.”
“There is no ability to save the form template as an ASP.NET web form.”
I think that last line is the killer, and it doesn’t seem technically difficult to address.
For a little over a year I have been using a modified version of Julian Bond’s Google News to RSS script to pull news searches into my aggregator. I even had it pulling news feeds into a corporate intranet until the feed was deemed too “unfiltered” for corporate consumption.
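The idea behind such scripts is straightforward: scrape the search results, then wrap them in RSS 2.0 so any aggregator can subscribe. Here's a minimal sketch of the syndication half, not Julian Bond's actual script; the scraping step itself is omitted, since real scrapers must chase Google's ever-changing (and unsanctioned) HTML.

```python
# Sketch of the scrape-then-syndicate idea behind scripts like Julian
# Bond's Google News to RSS. Only the "wrap results in RSS 2.0" half
# is shown; items would come from a scraper in practice.
import xml.etree.ElementTree as ET


def items_to_rss(query, items):
    """Wrap (title, url) result pairs in a minimal RSS 2.0 document."""
    rss = ET.Element("rss", version="2.0")
    channel = ET.SubElement(rss, "channel")
    ET.SubElement(channel, "title").text = f"News search: {query}"
    ET.SubElement(channel, "description").text = "Scraped search results"
    for title, url in items:
        item = ET.SubElement(channel, "item")
        ET.SubElement(item, "title").text = title
        ET.SubElement(item, "link").text = url
    return ET.tostring(rss, encoding="unicode")
```

Point an aggregator at the script's output URL and every news search becomes just another feed.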
If Google’s API included News (and other services such as Froogle, Groups, and even Images), I, and I’m sure many, many others, would switch to the sanctioned API service in a heartbeat! Until then, I’ll likely keep using the script.
Frankly, like Julian, I’m surprised that Google hasn’t extended its API to include News and other services.
Julian Bond: “I shouldn’t really complain as I’m fairly clearly breaking their terms. However, I’m getting increasingly fed up that they don’t have an XML (RSS or Atom) output from their search results. It’s also become pretty disappointing that their SOAP API still only covers the main search engine and hasn’t been extended to support the other parts of Google.”
Clearly there’s demand, as this hubbub demonstrates. Come on, ‘G’: cease with the desist letters and extend your API!