Saturday, January 31, 2009

I found a few issues adding playlists (m3u files) to Mediatomb. It works, but there are problems I have not been able to completely pin down. The main thing is that I have to add the individual playlist files to Mediatomb rather than depending on a timed rescan of a directory. When I time-rescan a directory containing a new playlist (only), the load operation appears in the Mediatomb console, but the playlist never appears in the database. If, however, I add the single file and the list has not been previously scanned, it works.

Also, once a directory is processed by a timed rescan, I have trouble adding new playlist files, individually, from that location. A couple of weeks ago, inexplicably, a playlist I had previously used without trouble disappeared from Mediatomb, and I could not add it back (no errors, it just didn't show up). I removed my other playlists and found I could not add those back either. So I made a brand new file, and that would not add. To fix this I had to make sure that the directory and all playlists were removed from Mediatomb, and that there were no timed rescans on that folder. I added back each m3u file individually, and they worked fine.

I think it's best to stay away from timed rescans, on my installation anyway, for playlists (I've had no trouble with rescans of music, video and photo files). I should note that this feature is supposed to work, and that I have seen Mediatomb pick up m3u files on timed rescans from my music folders in the past. But from my playlist directory, I've had odd behaviors (no, this is not related to file/directory permissions in my case; that is a common problem, though, so check that if it is not working for you).
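For anyone comparing setups, the timed rescan in question is configured in Mediatomb's config.xml under the import section. A sketch of a typical entry (the location path and interval here are placeholders, not my actual values):

```xml
<import hidden-files="no">
  <autoscan use-inotify="no">
    <!-- A timed rescan of a playlist folder; this is the mode that
         fails to register new m3u files for me. -->
    <directory location="/srv/media/playlists" mode="timed"
               interval="3600" recursive="yes" hidden-files="no"/>
  </autoscan>
</import>
```

Dropping the directory entry for the playlist folder is the "no timed rescans on that folder" step that let me add the m3u files back individually.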

For creating playlists I have been using fapg. This worked well for me because I already had some long lists of files to work with, and fapg can take these simple lists on stdin. Many people that use Mediatomb and similar servers use some other media handler just to create playlists. I have been experimenting with JuK. JuK seems to be particularly suited to larger collections (a problem I have with many of these tools is that they simply have not been designed for large and well organized media collections, with correct tags and such). There are other choices; I've seen Rhythmbox mentioned quite a bit. Such tools tend to support media playing or streaming on their own, but they also have nice playlist editors and search features that are useful alone.
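As an aside, the m3u format itself is simple enough that the stdin-to-playlist step is easy to picture. A rough Python sketch of the idea (fapg itself does far more, handling tags and other playlist formats):

```python
def make_m3u(paths):
    """Build a minimal extended M3U playlist from an iterable of file paths."""
    lines = ["#EXTM3U"]  # optional extended-M3U header line
    # Each remaining line of an m3u is just a path or URL; skip blanks.
    lines.extend(p.strip() for p in paths if p.strip())
    return "\n".join(lines) + "\n"

# Usage idea, mirroring fapg's stdin handling:
#   python make_m3u.py < tracklist.txt > playlist.m3u
```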

One of the reasons I have been working to replace the system I had built myself is that more standard bits and pieces (like m3u files) allow me to use a variety of off-the-shelf packages (there are any number of programs available to edit playlists, and now I can use any one, or more than one). There are also other servers and clients that support the same protocols. One does not have to be limited to one combination if one server is better than another for, say, one type of content or another.

"Pride and Prejudice and Zombies features the original text of Jane Austen's beloved novel with all-new scenes of bone-crunching zombie action. As our story opens, a mysterious plague has fallen upon the quiet English village of Meryton—and the dead are returning to life! Feisty heroine Elizabeth Bennet is determined to wipe out the zombie menace, but she's soon distracted by the arrival of the haughty and arrogant Mr. Darcy."...and so on.

"The images show a young deer running beside the Google car before it can been seen lying on its side. A third screen grab shows the deer lying on the side of the road in the distance as the car moves away.

"The car was travelling in broad daylight down a narrow road in rural New York state when the accident happened."

He complained the apartment was "attic-like" with small windows, low ceilings, obstructed views and ugly drainage grates. He demanded the return of his $10.7 million deposit and $30 million in damages.

In a counter suit, El-Ad accused Vavilov of libel and demanded $36 million in damages.

The Plaza overlooks Central Park and has inspired writers such as Kay Thompson, author of the "Eloise" stories about a 6-year-old girl who lived at the hotel, and Neil Simon, who wrote the Broadway play and hit film "Plaza Suite."

Wednesday, January 28, 2009

Having used a Mediatomb server with a Sony Play Station 3 client for a while now, I have a few observations...

* Video is tricky to set up. Getting the contents of a DVD into a form that can be streamed to your PS3 is neither documented nor easy. It does work, if you are persistent and get the right software. There are legal grey areas here.

* I have two DVDs from which I could not create an AVI file that the PS3 would play. In both cases, the PS3 would act like it was indeed playing the stream, but there would be no picture nor sound. This did not appear to be related to the bit rate, frame rate or any other setting I adjusted (using dvd::rip), so I have no explanation. Other DVDs I have tried worked fine, although my sample is small, just 6 or 7 discs.

* One other problem I had with one single disc was that the stream resulted in an incorrect aspect ratio. Actually, the video was cut off at the top and bottom and all I got was a very, very narrow band in the middle of the film. Strange. No explanation. Oh, and with another disc, the AVI video froze near the end, while the soundtrack continued. When the content was over, the sound ended normally, but several minutes of video were thus missing. A lot of weird things happen with this stuff.

* It is possible to get quite good video and sound quality in about a 2-3 GB AVI file. But the quality is not perfect, and there are occasional artifacts. The sound (in MP3) is likewise good, but not as good as it can be. So, as far as video goes, I will not be using the media server for more than a convenience. I can't see converting a library of DVDs to AVI.

* Music playlists work fine, but Mediatomb does not pick up new M3U files when it is set for a timed rescan of a directory containing such files. When it does a timed rescan, the playlist entry appears in the Mediatomb log output and, if you happen to catch it, in the web application's status line. However, the playlist is never made available in the database to select and play, and indeed does not appear in the database section of the web application. Playlists are added to Mediatomb with no trouble when they are individually added with the "add" link, rather than by using a timed scan.

* Scans that Mediatomb does on startup take quite a long time (on my system, I have a lot of content). It would be nice if this could be controlled, and also triggered manually when I know I've changed something.

* On the PS3 side, it would be nice if I could navigate in a more flexible manner by artist and album (such as when I'm playing a file from a playlist, and I'd like to see and play other tracks by that artist or from the particular album).

* Also, the PS3 takes quite a while to transfer complete information from the server when opening a media category. For example, when a playlist contains a couple thousand items, you have to select it, then wait until the PS3 has the entire list transferred. It does not cache this information at all. If you have a large music collection, then opening artists, for example, and browsing for something is completely impractical.

* If you select a playlist to play before the entire list is transferred, then the PS3 will only play the selections that were transferred up to that point.

* There are three visualizations available on the PS3 for music. Although they are all interesting at first, just having three makes me wish "plain black screen with the track information at the bottom" was one of the choices.

* This setup provides no way to edit M3U files from the PS3. An alternative would be to store media on the PS3 and use its playlists, but that doesn't give me the distributed interface I want.

* It would be nice if Mediatomb output information about what it is streaming. This would allow me to build alternative interfaces on the server side for playlist control.

* Multi-channel (more than stereo) soundtracks work fine.

* The PS3 will display thumbnail artwork embedded in MP3 files.

* Photos work fine, but the interface is not very robust. I'll stick with Google's Picasa, although leaving my photos volume up to date in Mediatomb's database certainly does no harm and the PS3 provides a good way to show photos to a group.

* I have not experimented with any formats other than AVI and MP3.

On the whole, the Mediatomb/PS3 setup (Ubuntu 8.10 server) is quite usable, as far as it goes. It doesn't do everything I want, but it's useful enough to keep installed. Next up, I plan to experiment with a streaming server, adding that stream to a Mediatomb playlist. By controlling the stream on the server, I'll have the control I want over what plays. And I'll be able to build a web interface for that control. The downside is that doing that will give up the PS3's ability to do that control: the stream will act just like an "internet radio station". But maybe with the combination of both approaches available, I can get (a little closer) to where I want to be.

Tuesday, January 27, 2009

"The Senate acted responsibly to give the Obama administration time to attempt to bring order to a mismanaged process," the West Virginia Democrat said in a statement.

"Many lawmakers worry that an estimated 20 million mostly poor, elderly and rural households are not ready for the switch, which requires owners of older television sets receiving over-the-air signals to buy a converter box or subscribe to cable or satellite TV."

Well... Yes, and that will be true no matter when the switch occurs. The easiest, maybe the only, way to fix the problems that arise from the change-over is to let the problems occur.

From the WSH, book recommendations on irrational decision making: Extraordinary Popular Delusions and the Madness of Crowds, Judgment Under Uncertainty, How We Know What Isn't So, The Winner's Curse and Predictably Irrational.

"It will be very boring," Joe Friedberg said in a conference call with reporters.

Not only boring, but it's going to take quite a long time (you can thank Norm Coleman for that), due to the Coleman campaign's recent request to have all 11,000 rejected absentee ballots reviewed. This is, of course, after claiming there were no wrongfully rejected absentee ballots in the first place.

Monday, January 26, 2009

Yes, it turns out that there is a limit on how long an email address can be. According to RFC-2821, section 4.5.3.1, the part before the '@', the "local-part", can be up to 64 characters. The part after the '@', the "domain-part", can be up to 255 characters.

So, including the '@' character, an email address is allowed to be up to 320 characters in total.

In other fascinating trivia, according to RFC-2822, section 2.1.1, the maximum line length in an email is 998 characters, excluding the CRLF.
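Those limits are easy to enforce mechanically. A small Python sketch (length checks only; this is not a full RFC 2821 syntax validator):

```python
MAX_LOCAL_PART = 64    # RFC 2821, section 4.5.3.1
MAX_DOMAIN_PART = 255  # RFC 2821, section 4.5.3.1

def within_length_limits(address):
    """Check only the RFC 2821 length limits, not address syntax."""
    # rpartition splits on the last '@', since the local part may not
    # contain an unquoted '@' anyway.
    local, sep, domain = address.rpartition("@")
    if not (sep and local and domain):
        return False
    return len(local) <= MAX_LOCAL_PART and len(domain) <= MAX_DOMAIN_PART
```

The longest address this accepts is the 320-character case noted above: 64 + 1 + 255.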

In most of my current Java SOA/JBI (Netbeans/Glassfish/OpenESB) project, where a BPEL needs to be invoked from inside an EJB, I have done some variation of simply using Netbeans' wizard to create a Web Service Client, and then dragging the operation from the WSDL over into the EJB source editor. This creates a service class, and classes for the message and response, that are generally pretty close to what I need, all using JAX-WS.

At one point in the project however, I determined that one of these calls created a really significant performance hit. The hit occurred someplace between leaving my EJB code:

MyOperationDataResponse myOperationDataResponse = port.MyOperationDataSoapOperation(myOperationDataRequest);

...and the initial Receive operation in the BPEL (in a different Service Assembly - this might be a factor), which was the earliest I could place a log message. The delay in there was as long as 4-5 minutes, and seemed to be driven by the size of the request payload (about 2 MBs, large but not huge).

According to VisualVM, a great tool by the way, the process was spending a large amount of time in this method:

com.sun.enterprise.server.ss.provider.ASSelector.select(long)

...which was interesting, but did not help me determine a course of action. In the end, I replaced the JAXB-style code with code that calls the web service using XMLBeans and javax.xml.soap.* classes only. I was unable to get the JAXB version of this call to hit the actual service with less than about a 5 minute delay, with the "large" payload I'm using. The equivalent part of this new call happens in .2 seconds (that's point-two). I don't know the repercussions of doing this sort of call vs. the web service client generated by Netbeans, or of having the XMLBeans classes in the environment. It would be nice not to have to introduce this other style into the application. This does seem to confirm some sort of issue in either the JAX-WS layers or the HTTP Binding Component, though. I suspect the HTTP-BC because the payload size alone does not seem to cause a problem.

The message and response classes are created using the scomp command line utility that comes bundled with an XMLBeans install. I found surprisingly few examples like the above code online, so I hope this is helpful to someone else down the road.

HDTV has already been delayed enough. Now more than ever, we need the new technological and economic development that freeing up our old bandwidth allows. If that development isn't "shovel-ready" I don't know what is.

"But one big problem with extending the transition, critics warn, is that many TV viewers could be confused. A delay could also be expensive for broadcasters. And it could burden public safety agencies and wireless companies waiting for the airwaves that will be freed by the shutdown ofanalog signals.

"Government agencies, consumer groups, television broadcasters and other parts of the industry have invested more than $1 billion over the past several years to educate consumers about the shift to digital broadcasting. The message all along has been that analog signals would be shut off on Feb. 17."

Friday, January 16, 2009

"The problem I have with Windows 7, is that Microsoft still seems to confuse simplicity with dumbing down. Windows 7 is supposed to be much simpler, much more trim, and much easier to use. Trying to manage any system settings is an exercise in futility. Just connecting to a local area network was a 12 step program towards insanity. I know Microsoft is trying to answer all the ridicule they get about security, but asking a user to decide security question after security question does not make security “simple.” Microsoft, please read this: Don’t ask a user if they want to open their computer up for sharing to home, work, or public -- block off all sharing unless a user asks to turn it on. Look at how your competition manages to handle security issues. You don’t need to try making it more simple, just as simple"

Nicely put. It's uncanny how Microsoft gets this sort of thing, system configuration and security, so consistently wrong, even in the face of great counterexamples that they could just as easily simply copy.

Friday, January 09, 2009

Understanding how Assign operations occur in BPEL can save a lot of headaches, and coding. One of my favorite run-time exception messages from BPEL is this:

ElementNotFoundInComplexTypeException: Particle not found in the complex type.

Assignment problems are commonly at the root of this error, and there are some facts about the Assign operations that are extremely helpful to know.

1. When assigning a fragment of XML from one variable to another, where the schemas of the two variables are in different namespaces, it is usually required to assign each leaf individually.

2. Data can be copied from one part to another, however the source namespace will be preserved, likely leading to undesired results.

3. Repeating nodes should be copied with a loop because any BPEL evaluation must result in a single node; however, this specification is not strictly enforced in OpenESB, and an Assign will copy recursively through nodes that occur multiple times. Note again, however, that this leaves the data in the source namespace.

4. Optional nodes can be copied at the leaf level by setting "Ignore Missing From Data" to "yes" on the Assign operation. This applies in items 2 and 3 above also.

5. To copy leaf nodes in a repeating set, a loop construct is needed.

Consider the following example, which will demonstrate these Assign properties: two schemas in two different namespaces, one WSDL using the schemas as input and output respectively, and a BPEL containing an Assign operation.

The output schema is a proper subset of the input in that all its elements match elements of the input; however, some input elements are not present in the output. ID2, for example, may occur in the input, but not in the output. In addition, the output and the input have different target namespaces.

The following trivial BPEL will be used for this example:

The following Assign operation is setup initially:

This runs fine, and appears to copy input data to the output as needed. For example, consider the following input.

All the data is there; however, the namespace in the output is that of the input, "http://xml.netbeans.org/schema/inputXmlSchema". A conventional unmarshaller, constructed using the output schema and applied to this data, will result in an empty structure because this message contains no data in the output namespace.

How to copy the data into the output namespace? The data must be copied at the leaf level. Consider the following updated Assign operation.

The problem is that Data1 cannot be copied in this way. It must be copied at the leaf level. What's more, the input above does not include an instance of ID1. Removing the assignments of Data1 and ID1 and re-running the above input works, however, even though Data1 elements are no longer created in the output:

First, how to copy ID1 as a leaf, when it may or may not exist? One way is to test its existence in an if construct and copy it conditionally, but with complex data this really is not practical. A better way is to set "Ignore Missing From Data" to "yes" on Assign1. First, add an assignment of the ID1 node to Assign1. Next, in the BPEL builder, go to design mode and select the Assign operation. In the Navigator view, select the assignment of ID1. You can tell which one it is by looking at the properties view.

In the properties view, change "Ignore Missing From Data" to "yes". Here is what the resulting BPEL source will look like:
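In WS-BPEL 2.0 source, a copy of this shape looks roughly like the following; the variable, part and namespace-prefix names here are assumed for illustration, not taken from the project:

```xml
<assign name="Assign1">
  <!-- ignoreMissingFromData suppresses the failure when ns0:ID1
       is absent from the input. -->
  <copy ignoreMissingFromData="yes">
    <from>$InputVar.inputPart/ns0:ID1</from>
    <to>$OutputVar.outputPart/ns1:ID1</to>
  </copy>
</assign>
```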

The service now runs without error, and ID1 will only appear in the output if it occurs in the input. Getting Data1 copied requires a bit more work. Firstly, if Data1 were defined to occur only once, then assigning each leaf element inside Data1 would work, provided that these assignments were all set to "Ignore Missing From Data". The "Ignore Missing From Data" setting would be needed because Data1 includes optional elements, so not all its possible elements will exist. Since Data1 elements can occur multiple times, a loop construct is needed. This example will use a ForEach with a second Assign. To accomplish this, the ForEach and the second Assign are added using the BPEL editor. Next, predicates are added to Data1 on both the input and the output. The predicate is the ForEach counter variable. Lastly, the ForEach is set to loop from 1 to the count of Data1 nodes in the input. The second Assign copies Data1 elements and needs to have ignoreMissingFromData="yes" on the copies for its optional fields. Nested complex elements DataField1 and DataField2 are left off the assignment for now.

The resulting assignment will work even though some fields do not exist in the input. Those fields will only appear in the output if they occur in the input. Here is the ForEach source.
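A sketch of such a ForEach in WS-BPEL 2.0 source; the counter predicate indexes Data1 on both sides, and all names here (including the leaf element SomeLeaf) are assumed for illustration:

```xml
<forEach name="ForEach1" counterName="Counter" parallel="no">
  <startCounterValue>1</startCounterValue>
  <finalCounterValue>count($InputVar.inputPart/ns0:Data1)</finalCounterValue>
  <scope name="Scope1">
    <assign name="Assign2">
      <!-- One leaf copy per optional field, each ignoring missing data. -->
      <copy ignoreMissingFromData="yes">
        <from>$InputVar.inputPart/ns0:Data1[$Counter]/ns0:SomeLeaf</from>
        <to>$OutputVar.outputPart/ns1:Data1[$Counter]/ns1:SomeLeaf</to>
      </copy>
    </assign>
  </scope>
</forEach>
```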

Data1's included name/value pair, Data1Field, is ignored in Assign2. Data1Field has a required name, and a choice construct for data of several types. Choice constructs can be handled exactly like elements that may occur zero times: with leaf-level Assign operations having ignoreMissingFromData="yes". Since Data1 may include zero to many Data1Field instances, this assignment must occur in a loop construct similar to the one above. This is left as an exercise for the reader.