Seconde Nature Second Life
Locus Sonus at Seconde Nature

Alejo Duque, Scott Fitzgerald

The first notes on the project are here: Seconde Nature

Add your afterthoughts related to the Second Life project presented at the Seconde Nature festival.

SL Tracking

Background

The process required to get the position of an object from Second Life into a Pure Data patch is fairly straightforward, with a few "gotchas" outlined below, most of which lie on the Second Life side. Second Life has its own scripting language, with the uninspired name of Linden Scripting Language (LSL). Scripts can be attached to any in-world object you have appropriate permissions for, including your body. LSL allows HTTP requests to be made from inside the world to connect to external content (web pages, audio, video). By making a request to a web server running PHP, we can send the x, y, z coordinates of the object we are interested in; in this instance we also send the object's rotation around the z-axis, in radians. When the request is made, the PHP script parses the information and opens a UDP socket on the local machine. A Pure Data patch listens on that socket for incoming information, which is routed according to the object's name, giving us the position of each object inside the virtual space.

Implementation

There were several different iterations of the LSL script, for various reasons. LSL throttles HTTP requests, limiting them to roughly one per second per object. If the script is attached to a large object (about the size of an average avatar), the object loses "energy" over time and cannot continue to make requests until that "energy" has replenished, which only happens while it is not making requests.

Another issue we encountered is that objects intermittently report bogus location data, claiming to be somewhere other than where they actually are. This happened irregularly, and usually appeared about an hour after "rezzing" an object carrying the script.

The original script we were using only reported an object's beginning and end positions; no updates in between were registered. This proved problematic, as objects could move from one side of the space to the other, sounding as if it had happened instantaneously.

The second iteration had the objects constantly updating their location, whether they were being moved or not. This became problematic for the reason mentioned above: they would erroneously report their location as somewhere else (often what appeared to be a previous location).

A compromise was reached whereby the object would report its location every second while it was being touched. Random location data would still sneak in while a participant was moving something, but it was much less frequent and caused no problems once the object was positioned. One unfortunate side effect is that if an object's position was changed through collision with another object or avatar, or if momentum carried it away from the place it was last touched, its reported location would no longer be accurate.

One other bit that needed to be added to the LSL was a boundary checker. We did not want the objects to leave the confines of the Locus Sonus parcel of land, but there was ample opportunity for people to move them out of our space. This was alleviated with a small script that constantly checked whether the object was within our land's coordinates. If it was not, it was turned "phantom" (to allow the object to pass through walls) and moved to the silent zone. Once it reached its resting place, it was returned to its normal "physical" state.

Code

1) The LSL bit
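A minimal sketch of what such a script could look like, combining the touch-triggered position reports and the boundary check described above. The URL, the parcel coordinates, the "home" position and the exact query-string layout are placeholders for illustration, not the values from the original script:

// Hypothetical sketch, not the original Locus Sonus script.
// Reports the object's position and z-rotation to the PHP bridge once a
// second while the object is being touched, and sends the object back to
// the silent zone if it drifts off the parcel.

string REPORT_URL = "http://example.org/sl2pd.php";   // placeholder URL
vector PARCEL_MIN = <100.0, 100.0, 0.0>;    // assumed south-west corner of the parcel
vector PARCEL_MAX = <160.0, 160.0, 100.0>;  // assumed north-east corner of the parcel
vector HOME       = <130.0, 130.0, 25.0>;   // assumed "silent zone" resting place

integer touched = FALSE;   // only report while someone is touching the object

report()
{
    vector pos = llGetPos();
    vector eul = llRot2Euler(llGetRot());
    float rotz = eul.z;                     // rotation around z, in radians
    // Casting a vector to string gives "<x, y, z>", which the PHP script
    // strips back down to "x y z" before forwarding over UDP.
    string data = (string)pos + " " + (string)rotz;
    llHTTPRequest(REPORT_URL
        + "?type=pos"
        + "&name=" + llEscapeURL(llGetObjectName())
        + "&data=" + llEscapeURL(data),
        [HTTP_METHOD, "GET"], "");
}

integer onParcel(vector p)
{
    return (p.x > PARCEL_MIN.x && p.x < PARCEL_MAX.x
         && p.y > PARCEL_MIN.y && p.y < PARCEL_MAX.y);
}

goHome()
{
    // Turn phantom so the object can pass through walls, park it in the
    // silent zone, then restore its normal physical state.
    llSetStatus(STATUS_PHYSICS, FALSE);
    llSetStatus(STATUS_PHANTOM, TRUE);
    // llSetPos() moves a non-physical prim at most ~10 m per call,
    // so keep stepping until the object has arrived.
    while (llVecDist(llGetPos(), HOME) > 0.5)
    {
        llSetPos(HOME);
    }
    llSetStatus(STATUS_PHANTOM, FALSE);
    llSetStatus(STATUS_PHYSICS, TRUE);
}

default
{
    state_entry()
    {
        // One tick per second, matching the HTTP throttle. The boundary
        // check (described above as a separate small script) is folded
        // into the same timer here for brevity.
        llSetTimerEvent(1.0);
    }

    touch_start(integer n)
    {
        touched = TRUE;
    }

    touch_end(integer n)
    {
        touched = FALSE;
        report();   // one last report with the final position
    }

    timer()
    {
        if (touched) report();
        if (!onParcel(llGetPos())) goHome();
    }
}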
2) php script

<?php
/*
 * Script for getting object type, name and data via http
 * and sending to a udp socket
 * - Scott Fitzgerald apr 08
 * based on - Robb Drinkwater, Aug. '07, Jan. '08
 */

$type     = $_REQUEST['type'];  // get object type
$obj_name = $_REQUEST['name'];  // get object name
$pd_data  = $_REQUEST['data'];  // get object data

echo "<h2>UDP Connection</h2>\n";

/* Get the IP address for the target host. */
$address = gethostbyname('localhost');

/* Create a UDP/IP socket. */
$socket = socket_create(AF_INET, SOCK_DGRAM, SOL_UDP);
if ($socket === false) {
    echo "socket_create() failed: reason: " . socket_strerror(socket_last_error()) . "\n";
} else {
    echo "OK.\n";
}

// open a port
echo "Attempting to connect to '$address' on port '13001'...";
$result = socket_connect($socket, $address, 13001);
if ($result === false) {
    echo "socket_connect() failed.\nReason: ($result) " . socket_strerror(socket_last_error($socket)) . "\n";
} else {
    echo "OK.\n";
}

/* Catch if data is in Second Life vector format ("<x, y, z>") and reformat */
if (strstr($pd_data, '<') !== false) {
    // print "found vector format";
    $as_list = str_replace(array("<", ">"), "", $pd_data); // strip lt/gt
    $as_list = str_replace(",", " ", $as_list);            // replace commas
    $pd_data = $as_list;
}

$formatted = $type . " " . $obj_name . " " . $pd_data . "\n"; // formatted as raw 'type','name','data' list

// if the "formatted" string length is greater than 1 (i.e. we assume it got data)
// send the data over the socket
if (strlen($formatted) > 1) {
    socket_send($socket, $formatted, strlen($formatted), MSG_DONTROUTE);
}
?>

3) pd patch
Conclusions

I think it would be best for Locus Sonus to consider moving to a different platform, or tool, for creating aural networked virtual spaces. Second Life has one thing going for it: a built-in user base. However, we did not capitalize on that base, for a number of reasons. With an estimated population density of 87 people per square kilometre, one has to wonder what Second Life really offers us in terms of interaction with people from around the world. A last-minute email sent to the Locus Sonus mailing list, and no "in world" publicity, accounted for (from what I saw) two visitors in Second Life who visited the space during the Seconde Nature festival and were not physically present at the event. If the population of Second Life is 1) not informed about Locus Sonus and the work and 2) nowhere near the event at the time, then it defeats the purpose of using such a space, as the pre-existing community offers us nothing.

The participants at Seconde Nature certainly enjoyed themselves, but it is hard to say how many of them thought of it as a "game" or a "Second Life thing" rather than as a sound work. Wrestling with cumbersome controls and using machines that were not designed to run the simulator did not help people experience what the work should have offered. Also, as we witnessed, the Second Life scripting language has a large number of flaws that seriously inhibit accurate position tracking, making the environment far from ideal for what we wished to achieve.

Drawing from the above, we can say that Second Life failed us for the following reasons:

1) Poor UI: the controls are awful, and people in the space who had never used SL before did not know what they were doing (even the SL team, having used it for several weeks prior, was still unable to control avatars and objects precisely).

2) Unreliable information: the Second Life scripting language sends bogus data and is generally prohibitive for this type of communication.

3) Lack of community: the one asset that Second Life does have, a large user base, is offset by the fact that a) there was no "in world" advertising of the event and b) population density in the simulator is so low that a "walk-in" is unlikely.

4) Misunderstanding of the intent: people perceived the whole experience as a "game", and their take-away was more about that aspect, the "game Second Life", than about a sound piece, which detracts from the intent.

Having said all that, I think the ideas we worked with (virtualized sound spatialization, linking the virtual and the real) are valid points of investigation. Perhaps Locus Sonus could explore other virtual spaces to work with. Panda3D http://panda3d.org/ , as mentioned by Alejo and used by SAIC, can run over the network and allows many people to log in remotely; apparently it has a sound synthesis engine as well. Ogre http://www.ogre3d.org/ is another open source 3D engine, though I am not sure of its ability to be networked (an interesting related project: http://jitogre.org exposes Ogre to Max/MSP/Jitter). Of course, there is also the possibility of creating a 3D environment in GEM and streaming it to clients, for another approach. Obviously it would not have all the functionality of an Ogre or Panda, but it could serve as a simple sandbox.