6/10/2002 9:04AM

OK, so I think they are going to try to put the 3D replay stuff on the cruise in August up in Oregon, so I basically need to learn as much as possible about this stuff as fast as I can. I will start with the web page, since this is the user interface. Most people will probably get to it from the expedition database queries.

I looked over the past 30 days of WF cruises and did not find any, so let's query for one. Oops, forgot: I have to be looking at it through Netscape ... if you are using IE, you don't even get the option.

Before I forget, the first page you go to is the Cruises web site (index.htm). From there, you can click on several links that will take you to the postcruise.asp page and send in different query parameters. The query parameters are (some examples):

1. step=2
2. ShipName=ptlo
3. Conjunction=AND
4. Dates=Future
5. search=advanced
6. Continue=Searc

If you click on the past 30 days for wfly, you might go to a link like this:

http://mww.mbari.org/expd/log/postcruise.asp?step=2&qShipName=wfly&Conjunction=AND&qDates=last30&search=advanced&Continue=Search+for+expeditions

So the postcruise.asp page is an important one. Let's look at that first. It is actually a PerlScript page. Somehow the page needs to find out if there are 3D replays available and create the links on the postcruise.asp page.

It first imports some include files for header stuff and expd functions. The postcruise_hdr.inc file creates a JavaScript function which checks the browser and sets up some custom stuff based on the browser. It also creates the start of the HTML file and then starts a table with some common header images and links.

The expd_functions.inc file contains declarations of some common functions that access the expeditions database. The subroutines are:

1. open_database: Takes in the database DSN and opens up a connection to it using the ADODB module in Perl.
2. FixString: Put quotes around text values (and nothing around numbers); escape embedded quotes.
3. toGMT: Convert a local time array to a GMT string that is good for a database DTG load.
4. toGMTfromES: Convert a local time array to a GMT string that is good for a database DTG load. Should have overloaded toGMT, but I feel this is safer.
5. toYDfromDate: Compute Year and YearDay given a GMT m/d/y.
6. URLEncode: Make variables safe for a URL.
7. dataLinks: Do a DB query against the ExpeditionData table and output href links as text. Assumes the $Conn object is already opened, so this function is to be called within an area where the DB is opened. It basically takes in ** There is some 3D replay stuff in here.
8. close_session: Close the session. This has the effect of clearing out all the session variables that are accumulated in preparation for the database load or the email sent to the logistics coordinator. If the user is a registered user, we do need to remember that session variable so that if she goes on to enter another postcruise it hasn't forgotten her identity. close_session() should be called whenever a post or pre cruise edit session has finished.

After including the above files, use statements import the following:

1. Win32::ASP (don't use CGI as it will hang IIS)
2. OLE
3. Time::Local
4. POSIX

The script then gets a copy of the name of the script (postcruise.asp), the DSN from the "Application" hash, and defines the database type as 'mssql'. It then grabs some variables and places them in session variables.

The script looks for a query parameter called "step". This is basically the menu step where the script branches and builds whichever page is appropriate: the incoming parameters are checked and, based on those, certain functions are called to build the page.
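As a side note, the toYDfromDate helper from expd_functions.inc is simple enough to pin down with a sketch. The real routine is Perl; this Python version is just to capture the semantics (function name and signature are my rendering of it):

```python
import datetime

# Sketch of toYDfromDate from expd_functions.inc: given a GMT month/day/year,
# compute the Year and YearDay (1-based day-of-year).
def to_yd_from_date(month, day, year):
    yearday = datetime.date(year, month, day).timetuple().tm_yday
    return year, yearday

print(to_yd_from_date(6, 10, 2002))  # -> (2002, 161)
```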
Here are the query params and what functions are called:

step = 1 (or nothing):
  - if search = advanced --> advanced_search()
  - else --> step1()
step = 2:
  - if data_table = yes --> step2('data_table')
  - else --> step2()
step = 3 --> step3()
step = 'display' and query param 'ExpeditionID' exists --> display_expd()
If step does not exist (or is not any of the above) and the query param 'finish' pattern-matches 'done':
  - process_form_entries()
  - if the session parameter 'RegisteredUser' exists --> update_expedition(); display_expd(); finished_email()
  - else --> email()
  - close_session()
If none of the above and the query parameter 'finish' pattern-matches 'dive information':
  - process_form_entries()
  - if the session parameter 'RegisteredUser' exists --> update_expedition(); finished_email()
  - else --> email()
  - close_session()

Then the script includes the postcruise_ftr.inc file for the footer HTML. After all this is done, the functions are defined.

Now, as far as the 3D stuff goes, I don't care about much except for how the URL gets created when an expedition is found. For the purposes of this investigation, I will assume that clicking on the link for the "Past 30 days" is very similar to an advanced query. When you do this, the parameters are set to:

step = 2
ShipName = wfly
Conjunction = AND
Dates = last30
search = advanced
Continue = Searc

Based on the above, the step2 function should be called. In the step2 function:

1. Open the database connection
2. Grab the conjunction from the query parameters (AND)
3. Initialize a variable called $has3Dlinks to 0
4. Copy all the form values into a hash to work with
5. Construct the date part of the SQL statement based on the qDates field
6. Execute the SQL and build the HTML page

If there is 3D replay data available, it basically takes the URL(s) from the ExpeditionData table and places them in the HTML within a JavaScript statement that tells the browser to open it in another window.
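The branching above can be summarized as a dispatch table. This is an illustrative Python translation of the postcruise.asp "step" logic (the real page is PerlScript; the function bodies here are stand-in stubs, and the 'finish' handling is omitted):

```python
# Stand-in stubs for the page-building functions named in the notes.
def step1():            return 'step1'
def step2(mode=None):   return f'step2({mode})' if mode else 'step2'
def step3():            return 'step3'
def advanced_search():  return 'advanced_search'
def display_expd():     return 'display_expd'

def dispatch(params):
    """Pick the page-building function based on the query parameters."""
    step = params.get('step')
    if step in (None, '1'):
        return advanced_search() if params.get('search') == 'advanced' else step1()
    if step == '2':
        return step2('data_table') if params.get('data_table') == 'yes' else step2()
    if step == '3':
        return step3()
    if step == 'display' and 'ExpeditionID' in params:
        return display_expd()
    return None  # 'finish'=done / 'dive information' handling omitted here

# The "Past 30 days" case: step=2 wins even though search=advanced is also set.
print(dispatch({'step': '2', 'ShipName': 'wfly', 'search': 'advanced'}))  # -> step2
```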
The page that it opens is 3D.asp, e.g.:

http://mww.mbari.org/3Dreplay/3D.asp?dives=tibr425

--------------------------------------------------------------------------------------------------------------------------

3D.asp: Similar to the last web page, this page first includes the expd_functions.inc page. Then it uses the following Perl libraries:

1. Win32::ASP
2. VRML::VRML2
3. OLE
4. LWP::Simple

A variable called $wrlBase is set to "http://mww2.shore.mbari.org/ARCHIVE/3Dreplay".

---------------------------------------------------------------

In order to run the geopdvc.pl script on the local machine, you need to use PPM to install:

1. VRML.pm
2. Statistics-Descriptive.pm

(You also must have libwww-perl, which I believe is installed with ActiveState.)

6/18/2002 7:43AM

OK, I think IS is ready with the PCs (I will have to get a status update from Todd). Currently the systems are as follows (on shore): I spoke with Cathy and the Wharfrat server was pretty much ready to go.

I looked over the 3D stuff, and the first thing to do is basically copy the 3DReplay directory from Tornado to the Wharfrat machine. Wharfrat has two drives (C and D) and D has 60GB, so I will copy it to there. I VNC'd to Wharfrat and then mounted the ShipData drive on Tornado. It connected as the F drive. I then copied the 3DReplay directory to the root of the D drive.

In addition to this, I need to get a copy of the expedition database web site over to Wharfrat ... maybe I will have Nancy help me with this? I need to look at the setup for IIS for the expd pages. I connected to Typhoon using Wharfrat's IIS MMC and looked at the expd web page first.
It is set up as follows:

- Virtual Directory Tab --
  Content comes from a local directory (D:\expdlog)
  Access permissions are: read
  Content control is: log access and index this directory
  Application settings: name = expd; Starting point: /expd; Permissions: script
- Documents Tab --
  Enable Default Document (checked):
  - index.html
  - index.htm
  - Default.htm
  - default.html
  - Default.asp
  Enable Document Footer (unchecked)
- Directory Security Tab -- Edit button:
  - Allow anonymous access (unchecked)
  - Basic Authentication (checked)
  - Windows NT/Challenge Response (checked)

So this means that on wharfrat, I need to set up the expdlog directory, copy it from typhoon, and then set up the virtual directory on wharfrat's web server.

***** NOTE: I need to find out what the explorer permissions are on the expdlog directory on typhoon.

I copied all the files in the expdlog directory on typhoon to the d:\expdlog directory on wharfrat. I also looked at the IIS settings for the 3dreplay web app.

- Virtual Directory Tab --
  Content comes from a share located on another computer (\\tornado\shipdata\3dreplay)
  Access permissions are: read
  Content control is: log access, directory browsing allowed, and index this directory
  Application settings: name = 3Dreplay; Starting point: /3Dreplay; Permissions: script
- Documents Tab --
  Enable Default Document (checked):
  - index.html
  - index.htm
  - Default.htm
  - default.html
  - Default.asp
  Enable Document Footer (unchecked)
- Directory Security Tab -- Edit button:
  - Allow anonymous access (unchecked)
  - Basic Authentication (checked)
  - Windows NT/Challenge Response (checked)

I tried to access the expd/log/postcruise page and got an error for no ASP.pm installed, so I ran PPM to install that. The page ran, but then I got an empty result set, which is probably because I don't have the DSN set up. I will try that. Nope; as I suspected, because all of the login stuff is in the code, it probably does not need a DSN. Maybe we are missing the OLE module.
I installed Win32-OLE and will try again. Nope; better start digging into the code.

In the global.asa file, I changed:

$Application->{'BaseURL'} = "http://mww.mbari.org";
to:
$Application->{'BaseURL'} = "http://wharfrat";

just to see if that helps. It will have to be changed again later. Well, it gets me to a different error. This error looks like it could not get the object that is the connection to the database ... could be because I don't have Win32-ADO installed ... try that next. Nope, what is up with this?!

OK, after much research, it basically looks like it comes down to the fact that the way the recordset was created was not valid. So I changed:

$Conn = CreateObject OLE "ADODB.Connection";
to:
$Conn = Win32::OLE->new('ADODB.Connection');

and then changed:

$RS = $Conn->Execute($sql);
to:
$RS = Win32::OLE->new('ADODB.Recordset');
$RS->Open($sql,$Conn,adOpenKeyset,adLockOptimistic,adCmdTable);

and then it worked OK. I also changed:

$Application->{'DSN'} = "Server=godzilla;Database=expd;UID=expddba;PWD=password;";
to:
$Application->{'DSN'} = "Server=alaskanwind.wf.mbari.org;Database=expd;UID=expddba;PWD=password;";

and:

$Application->{'BaseURL'} = "http://mww.mbari.org";
to:
$Application->{'BaseURL'} = "http://wharfrat.wf.mbari.org";

------------------------------------------------------------------------------------------------------

First, in expd_functions.inc I changed:

$Conn = CreateObject OLE "ADODB.Connection";
to:
$Conn = Win32::OLE->new('ADODB.Connection');

Now on to postcruise.asp ... in the subroutine advanced_search, I changed:

$RScs = $Conn->Execute($sql);
to:
$RScs = Win32::OLE->new('ADODB.Recordset');
$RScs->Open($sql,$Conn,adOpenKeyset,adLockOptimistic,adCmdTable);

and:

$RSrgn = $Conn->Execute($sql);
to:
$RSrgn = Win32::OLE->new('ADODB.Recordset');
$RSrgn->Open($sql,$Conn,adOpenKeyset,adLockOptimistic,adCmdTable);

STILL HAVING MAJOR PROBLEMS .....
OK, after much abuse, I realized that the SQLOLEDB provider was not installed, so I installed MDAC on the wharfrat server!!! ARRRRRRRRRRRRRGGGGGGGGGGGGGGGG, what a waste of time!!! YES, that was it ... well, after a ton of wasted time ... we are back online. Now switch the DSN to alaskanwind. (By the way, I reverted all the above changes.) OK, the alaskanwind connection has a problem, but Debbie says she can fix it. It looks like I am back on track.

I created a Visio diagram to show the differences between the two setups; I will update that diagram as I go. For now, I will work through the pages.

GLOBAL.ASA
Everything looks OK; there are some changes I marked in red that are questions or future things to do (ask Mike).

POSTCRUISE_HDR.INC
Converted a link to an image on the external web server to a local image link, and removed the fact that it is a link to a web page.

EXPD_FUNCTIONS.INC
Changed:
my $expd3Dlink = "http://expd.shore.mbari.org/3Dreplay/3D.asp?";
to:
my $expd3Dlink = "http://wharfrat.wf.mbari.org/3Dreplay/3D.asp?";

POSTCRUISE.ASP
Outside of the changes made already, there is not much more for now. I will go over the code with a fine-tooth comb later.

3D.ASP
I had to install the VRML module on wharfrat. The 3D.asp file points to the mww2 server, which is lepas. They all seem to have an ARCHIVE directory of some sort, so I think the best thing to do is to store all the data in an ARCHIVE directory on the D: drive on wharfrat. So I created one. Actually, the 3dreplay directory is the one we want; I need to create a mapping (virtual directory) on the IIS server for it (didn't I already? Yes, I did).
I changed:

$wrlBase = "http://mww2.shore.mbari.org/ARCHIVE/3Dreplay";
to:
$wrlBase = "http://fww.wf.mbari.org/ARCHIVE/3Dreplay";

and:

my $_dsn = "Server=godzilla;Database=expd;UID=everyone;PWD=guest;";
to:
my $_dsn = "Server=alaskanwind;Database=expd;UID=everyone;PWD=guest;";

I also changed the navigation links:

Web calendar · Lobos cruises · Flyer cruises · Precruise entry · Keyword search · Advanced search · Administrative page · Cruise planning & scheduling · Shipboard video procedures
to:
Web calendar · Lobos cruises · Flyer cruises · Precruise entry · Keyword search · Advanced search · Administrative page · Cruise planning & scheduling · Shipboard video procedures
(COULD BE SOME FORMATTING ISSUES HERE, NEED TO CHECK)

BIG NOTE HERE: There is something in the code that references:
http://www.geovrml.org/1.0/protos/GeoOrigin.wrl
I suspect I will need to copy that to a local directory to make sure we don't need the web; I will have to wait and see.

Again, another one of these:

my $_dsn = "Server=godzilla;Database=expd;UID=everyone;PWD=guest;";
to:
my $_dsn = "Server=alaskanwind;Database=expd;UID=everyone;PWD=guest;";

Changed:

replaceString("$dir\\$file", "search.shore.mbari.org/ARCHIVE", "localhost");
replaceString("$dir\\$file", "menard.shore.mbari.org", "localhost");
replaceString("$dir\\$file", "expd.shore.mbari.org", "localhost");
to:
#replaceString("$dir\\$file", "search.shore.mbari.org/ARCHIVE", "localhost");
#replaceString("$dir\\$file", "menard.shore.mbari.org", "localhost");
#replaceString("$dir\\$file", "expd.shore.mbari.org", "localhost");
replaceString("$dir\\$file", "fww.wf.mbari.org/ARCHIVE", "localhost");
replaceString("$dir\\$file", "fww.wf.mbari.org", "localhost");
#replaceString("$dir\\$file", "expd.shore.mbari.org", "localhost");

Changed:

    in \\tempest\tempbox\\3Dreplay
to:
    in d:\\temp\3DReplay

and:

#my $tmpDir = "\\\\tempest\\tempbox\\" . GetFormValue('dir');
to:
my $tmpDir = "d:\\temp\\3DReplay";

and created a temp directory on the d: drive.
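For reference, replaceString appears to be a helper that does an in-place search-and-replace of hostnames inside a generated file. I haven't pasted its Perl body here; a minimal Python equivalent (hypothetical name and signature, just to record what I believe it does) would be:

```python
# Minimal stand-in for the replaceString(path, old, new) helper: rewrite
# every occurrence of a hostname inside a generated file, in place.
# (The real helper is Perl; this is only an illustrative sketch.)
def replace_string(path, old, new):
    with open(path, 'r') as f:
        text = f.read()
    with open(path, 'w') as f:
        f.write(text.replace(old, new))
```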
Similarly, I changed:

replaceString($file, "search.shore.mbari.org/ARCHIVE", "localhost");
replaceString($file, "expd.shore.mbari.org", "localhost");
replaceString($file, "dods.shore.mbari.org", "localhost");
to:
#replaceString($file, "search.shore.mbari.org/ARCHIVE", "localhost");
#replaceString($file, "expd.shore.mbari.org", "localhost");
#replaceString($file, "dods.shore.mbari.org", "localhost");
replaceString($file, "fww.wf.mbari.org/ARCHIVE", "localhost");
replaceString($file, "fww.wf.mbari.org", "localhost");
#replaceString($file, "dods.shore.mbari.org", "localhost");

and made the same from/to change again in a second place.

Now, in the alaskanwind database, I had to change all the URLs over in the data. In table ExpeditionData, in the field URL, I changed:

http://expd.mbari.org to http://wharfrat.wf.mbari.org
http://mww.mbari.org to http://wharfrat.wf.mbari.org
http://search.shore.mbari.org to http://wharfrat.wf.mbari.org

I used SQL Query Analyzer with the following (example):

begin transaction
update ExpeditionData
set url = stuff(url,8,3,'wharfrat.wf')
where url like 'http://mww.mbari.org%'
commit

Similarly, in the terrain tables, in columns TopTileURL and TopImageURL, I changed:

http://menard.shore.mbari.org/vrml/terrain
to
http://wharfrat.wf.mbari.org/vrml/terrain

NOTE: I left the godzilla references to the samplesDB for now; we probably want to change those later so you get broken links instead of trying to reach godzilla over the network.
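As a sanity check on that STUFF call: T-SQL STUFF(s, start, length, repl) deletes `length` characters starting at 1-based position `start` and inserts `repl` in their place. A quick Python model of it (illustrative only, not part of the site code):

```python
def tsql_stuff(s, start, length, repl):
    # T-SQL STUFF uses 1-based positions: delete `length` chars at `start`,
    # then insert `repl` where they were.
    return s[:start - 1] + repl + s[start - 1 + length:]

# 'http://' is 7 characters, so position 8 is the 'm' of 'mww'; deleting
# 3 chars there and inserting 'wharfrat.wf' rewrites the hostname:
print(tsql_stuff('http://mww.mbari.org/expd/log/postcruise.asp',
                 8, 3, 'wharfrat.wf'))
# -> http://wharfrat.wf.mbari.org/expd/log/postcruise.asp
```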
OK, due to some oddities and unresolved links, I changed all the fww.wf's back to wharfrat.wf. I added a virtual directory to IIS that points to the ARCHIVE directory on the D: drive. Then I created another virtual directory that pointed to the D:\3Dreplay directory but was aliased to http://wharfrat.wf.mbari.org/ARCHIVE/3Dreplay, and it now works. I have to get the terrain data off menard now and over to wharfrat.

popExpeditionData.pl

This one is going to be tough. Changed:

$_dsn = 'Server=godzilla;Database=expd;UID=expddba;PWD=password;';
to:
$_dsn = 'Server=alaskanwind;Database=expd;UID=expddba;PWD=password;';

DOWD.pm

I changed:

$self->{RootWeb} = 'http://search.shore.mbari.org/ARCHIVE';
to:
$self->{RootWeb} = 'http://fww.wf.mbari.org/ARCHIVE';

NOTE: This one could be interesting, because I believe it is looking for .dat.gz files, which we do not have, although I guess I could just create those in the field so the script does not need to be altered.

OK, so looking at the code, the script is expecting the following directory structure:

Framegrabs: http://wharfrat.wf.mbari.org/ARCHIVE/frameGrabs/Tiburon/stills/2002
ROVCTD: http://wharfrat.wf.mbari.org/ARCHIVE/rovctd/tibr/2002
Logger: http://wharfrat.wf.mbari.org/ARCHIVE/logger/2002/wfly
CamLog: http://wharfrat.wf.mbari.org/ARCHIVE/camlog/2002
Navigation: http://wharfrat.wf.mbari.org/ARCHIVE/nav/2002/tibr

I commented out the following lines:

#$PDirs{rovctd} = [ qw(rovctd) ];
#$PDirs{CameraLog} = [ qw(logger camlog) ];

and:

#$PltDirs{rovctd} = [ qw(vnta tibr) ];
#$PltDirs{logger} = [ qw(vnta tibr) ];
#$PltDirs{camlog} = [ qw(vnta tibr) ];

and changed:

foreach $DataType ( qw(frameGrabs rovctd CameraLog Navigation) ) {
to:
# foreach $DataType ( qw(frameGrabs rovctd CameraLog Navigation) ) {
foreach $DataType ( qw(frameGrabs Navigation) ) {

I am hoping this will leave out the stuff for the rovctd and camlog (which I don't think we need).
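To keep that layout straight, here is a small Python scratch helper (my own notes code, not part of the scripts) that builds the URLs the script expects for a given year. The path layout comes from the list above; the function name and parameters are mine:

```python
# Scratch helper: build the ARCHIVE URLs popExpeditionData.pl expects.
# The base host and path layout are from my notes above; the function
# name and parameters are my own invention.
def archive_urls(base, year, rov='tibr', ship='wfly'):
    return {
        'frameGrabs': f'{base}/frameGrabs/Tiburon/stills/{year}',
        'rovctd':     f'{base}/rovctd/{rov}/{year}',
        'logger':     f'{base}/logger/{year}/{ship}',
        'camlog':     f'{base}/camlog/{year}',
        'nav':        f'{base}/nav/{year}/{rov}',
    }

urls = archive_urls('http://wharfrat.wf.mbari.org/ARCHIVE', 2002)
print(urls['nav'])  # -> http://wharfrat.wf.mbari.org/ARCHIVE/nav/2002/tibr
```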
After running it, I found that fww.wf was not resolvable out on the flyer (Todd fixed it in a jiffy). I also found that IIS does not insert \n's in the directory browsings, so I will have to split on something else, like the <br> tags.

On beroe: Navproc stuff. On batray, in /wflogger/data:

YYYYDDDadvlogr.dat
YYYYDDDdataprobelogr.dat
YYYYDDDdatavislogr.dat
YYYYDDDdsledlogr.dat
YYYYDDDrovctdlogr.dat
YYYYDDDsamplelogr.dat
YYYYDDDshipnavlogr.dat
YYYYDDDvideologr.dat

7/3/2002 9:00AM

OK, the boat is back and I am starting to work on it again. Mike mentioned in a meeting that the Perl installed on the ship should be 5.2.2, so let's do that first before we go any further.

OK, so easier said than done. I was getting the ADO '0115' error and had to turn on server-side script debugging to figure out I was missing some modules. I installed the Win32-ASP module but started getting this error:

# Dive table fields #
error '80004005'
Can't call method "BinaryRead" on unblessed reference at D:/Perl/site/lib/Win32/ASP.pm. BEGIN failed--compilation aborted line 13.

Whatever the heck that means. I searched and tried and searched and tried, and I am down to the belief that I need an old version of Win32-ASP, and I cannot find it. I have sent out pleas for help. In the meantime, I will re-install the latest Perl and keep on working.

OK, the popExpeditionData.pl script is the one I am working on now. In order to get the framegrabs to work correctly, I had to use a URL redirection in IIS. I created a framegrabs directory in d:\ARCHIVE and then set the properties on that directory in IIS so that it would point to the following URL --> http://beroe.wf.mbari.org/ It seems to be working from the browsing perspective; we will have to see if it works in the script processing.

OK, now we need to be able to access the navigation data. The problem is that, in order to do this, it must go through a processing script to move it to a directory on the ship where it can be accessed via the web. This means I need to copy/emulate the scripts that run on barnacle, for at least the navigation data. Hmmm ... OK, I am printing those out to go over them. Wow, these open up a can of worms.
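Going back to the IIS directory-browsing problem (listings with no newlines): a quick sketch of pulling filenames out of such a page by splitting on the <br> tags. The sample HTML fragment here is made up; IIS's real listing markup may differ slightly.

```python
import re

# Made-up fragment in the style of an IIS directory-browsing page:
# entries run together on one line, separated only by <br> tags.
listing = ('Wednesday, July 3, 2002 9:00 AM 12345 '
           '<A HREF="/ARCHIVE/nav/2002/tibr/2002177tibr.txt">2002177tibr.txt</A><br>'
           'Wednesday, July 3, 2002 9:01 AM 23456 '
           '<A HREF="/ARCHIVE/nav/2002/tibr/2002178tibr.txt">2002178tibr.txt</A><br>')

# Split on <br> instead of \n, then grab the anchor text from each entry.
files = []
for entry in listing.split('<br>'):
    m = re.search(r'<A HREF="[^"]*">([^<]+)</A>', entry)
    if m:
        files.append(m.group(1))

print(files)  # -> ['2002177tibr.txt', '2002178tibr.txt']
```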
I wonder if it would be easier just to add a step to FTP the nav logger files from batray to wharfrat. Maybe so. OK, so I will use batray with the account wflogger. I created a directory called 3DReplay in the /users/wflogger directory to house all this stuff. I copied .env_tibr over from barnacle to /users/wflogger/3DReplay and changed:

export HOME="/u/rovctd"
to:
export HOME="/users/wflogger"

I then added a 3DReplay directory in all the other directories that were listed in this environment and changed mww.mbari.org to fww.wf.mbari.org.

NOTE: IN ORDER TO RUN ALL THIS STUFF CORRECTLY, IT HAS TO BE DONE IN A BASH SHELL, WHICH IS LOCATED IN /usr/tiburon/mbari/local/bin. A DRAG, BUT NECESSARY.

I then FTP'd moveLogrFiles.perl from barnacle to batray, into the 3DReplay/bin directory, and changed the following in this script. In the initialize procedure, I changed:

$runPath = "$HOME/bin/";
to:
$runPath = "$HOME/3DReplay/bin/";

Changed:

$filetype = "shipnavlogr.dat.gz";
to:
$filetype = "shipnavlogr.dat";

Changed:

$trnsPath = "/users/datatransfer/$shipName/";
to:
$trnsPath = "/wflogger/data/";

Changed:

$status = system (join(' ', "mv -f", $trnsFile, $datPath));
to:
$status = system (join(' ', "cp", $trnsFile, $datPath));

Now for the file processNav.perl:
-----------------------------------------
Changed:

$savPath = "/ARCHIVE/logger/";
to:
$savPath = "$HOME/3DReplay/ARCHIVE/logger/";

Changed:

$archPath = "/ARCHIVE/nav/";
to:
$archPath = "$HOME/3DReplay/ARCHIVE/nav/";

Changed:

push @INC, "$HOME/lib/";
to:
push @INC, "$HOME/3DReplay/lib/";

Changed:

$Cp = "/usr/mbari/bin/cp";
$Mv = "/usr/mbari/bin/mv";
to:
$Cp = "/bin/cp";
$Mv = "/bin/mv";

Changed:

&calcEastNorth ($lat, $lon); # -- Calculate the easting adn northings --
to:
#&calcEastNorth ($lat, $lon); # -- Calculate the easting adn northings --

(I just commented it out because we do not have what we need to run this on the ship.)

Changed:

push (@East, $easting);
push (@North, $northing);
to:
push (@East, 0);
push (@North, 0);

Commented out the following block:

if ($DEBUG == 0) {
  unlink $wrkfile, $shipFile, $rovFile;
} elsif ($DEBUG == 1) {
  print "Debugging on...see files in $prcPath\n";
}

--------------------------------------------------------------

I just realized that 3DTerrain.asp also needs some changes. I was copying over the terrains and got some wrong URLs when I tried to look at just the terrain.

3DTerrain.asp:
-------------------------------------------
Changed:

$wrlBase = "http://search.shore.mbari.org/ARCHIVE/3Dreplay";
to:
$wrlBase = "http://fww.wf.mbari.org/ARCHIVE/3Dreplay";

changed:
to:

Changed:

my $_dsn = "Server=godzilla;Database=expd;UID=everyone;PWD=guest;";
to:
my $_dsn = "Server=alaskanwind;Database=expd;UID=everyone;PWD=guest;";

again, changed:
to:

Again changed:

#my $_dsn = "Server=godzilla;Database=expd;UID=everyone;PWD=guest;";
to:
my $_dsn = "Server=alaskanwind;Database=expd;UID=everyone;PWD=guest;";

again changed:

#my $_dsn = "Server=godzilla;Database=expd;UID=everyone;PWD=guest;";
to:
my $_dsn = "Server=alaskanwind;Database=expd;UID=everyone;PWD=guest;";

Added:

replaceString("$dir\\$file", "fww.wf.mbari.org", "localhost");

so the block reads:

replaceString("$dir\\$file", "search.shore.mbari.org/ARCHIVE", "localhost");
replaceString("$dir\\$file", "menard.shore.mbari.org", "localhost");
replaceString("$dir\\$file", "expd.shore.mbari.org", "localhost");
replaceString("$dir\\$file", "fww.wf.mbari.org", "localhost");

Now, there is a temporary directory where some of this work is done. I needed to relocate it from \\tempest\tempbox\????? to d:\temp\3DReplay, so I changed:

my $tmpDir = "\\\\tempest\\tempbox\\" .
GetFormValue('dir');
to:
my $tmpDir = "d:\\temp";

Added:

replaceString($file, "fww.wf.shore.mbari.org", "localhost");

so the block reads:

replaceString($file, "search.shore.mbari.org/ARCHIVE", "localhost");
replaceString($file, "expd.shore.mbari.org", "localhost");
replaceString($file, "dods.shore.mbari.org", "localhost");
replaceString($file, "fww.wf.shore.mbari.org", "localhost");

Again added:

replaceString($file, "fww.wf.shore.mbari.org", "localhost");

so that block reads:

replaceString($file, "search.shore.mbari.org/ARCHIVE", "localhost");
replaceString($file, "expd.shore.mbari.org", "localhost");
replaceString($file, "dods.shore.mbari.org", "localhost");
replaceString($file, "fww.wf.shore.mbari.org", "localhost");

--------------------------------------------------------------------------------

I also need to edit dive.asp. Changed (in three places):

$logrFile =~ s#http://mww.mbari.org/ARCHIVE/rovctd/.+/\d\d\d\d/\d\d\d/##;
to:
#$logrFile =~ s#http://mww.mbari.org/ARCHIVE/rovctd/.+/\d\d\d\d/\d\d\d/##;
$logrFile =~ s#http://fww.wf.mbari.org/ARCHIVE/rovctd/.+/\d\d\d\d/\d\d\d/##;

---------------------------------------------------------------------------------------------------------

To set up the client, you need to install the following:

1. Download Netscape Communicator 4.7 from Netscape.
2. Install it:
   a. typical installation
   b. Uncheck all the boxes
3. Download CosmoPlayer from www.cosmosoftware.com.
4. Close all browsers.
5. Install by double-clicking it:
   a. Uncheck "Previewing in Cosmo Authoring Applications"
   b. Check "Other (unsupported browsers)"
   c. Check "Netscape - Unsupported Version"
   d. Find the plugins directory for Netscape.
   e. Ignore the message about Service Pack 3.
6. Install GeoVRML by going to http://www.geovrml.org/1.0/download and clicking on "Click to install GeoVRML for Windows" to download it.
7. Close your browser(s).
8. Double-click to install (let it install to "Program Files").
9. Fire up Netscape and try to browse a GeoVRML world.
10. Click on the small check box in the lower right hand corner of the Cosmo player, click on the "Keyboard" tab, then change the shift modifier to "Turbo Mode and Continuous Seek".

==================================================================================================================================================================

OK, I needed a clean-slate start here. I am in that place of not seeing the forest for the trees.

Before the dive:
1. A precruise is entered into the expedition database.

What happens when a dive occurs on Western Flyer?
1. Navproc is recording navigation data to a file YYYYDDDshipnavlogr.dat (at some point it looks like it is compressed to a .gz file).
2. Someone is manning the vicki station and snapping framegrabs, which are then stored on beroe in the directory /usr/people/vicki/YYYY/DDD/. As the frame grabs are taken, four files are created:
   - HH_MM_SS_FR.comment
   - HH_MM_SS_FR.jpg
   - HH_MM_SS_FR.overlay
   - HH_MM_SS_FR.rgb
   where HH is hour, MM is minute, SS is second, and FR is frame number.
3.
Navproc is recording rovctd data to a file YYYYDDDrovctdlogr.dat (at some point it looks like it is compressed to a .gz file).

After the dive is complete:
1. A dive number and start and end times for the dive are entered into the postcruise page in the expedition database.
2. The popExpeditionData.pl script is run on a Windows NT machine.
3. The vfcbydive.pl script is run on an SGI machine.
4. The geopdvc.pl script is run on a Windows NT machine.

?????????????????????'s
1. Are the navproc log files (YYYYDDDshipnavlogr.dat) readable for the current day, or should we make a copy before trying to read them?
2. How are the navproc files compressed, and when?

USER'S GUIDE:
1. Precruise must be entered on shore into the expedition database.
2. Copy the database from Godzilla to alaskanwind.
3. Change URLs in ????? and ????? to point to fww.wf.mbari.org.
4. Dive takes place.
5. After the dive is complete, enter postcruise information (dive number, start and end times) at http://fww.wf.mbari.org/expd/log/postcruise.asp
6. Login to batray as wflogger (p/w 2020rov).
6.25. type "/usr/tiburon/mbari/local/bin/bash"
6.5. type "cd 3DReplay"
7. type "source .env_tibr"
8. type "export PATH=$PATH:/users/wflogger/3DReplay/bin:/mbari/local/bin"
9. type "rm -f /users/wflogger/3DReplay/.wfNAVmail"
10. type "/users/wflogger/3DReplay/bin/moveLogrFiles.perl NAV > /users/wflogger/3DReplay/.wfNAVmail 2>&1"
11. type "cd /users/wflogger/3DReplay/nav/wfly/raw"
12. type "gzip *shipnavlogr.dat"
13. type "cd"
13.5. type "cd 3DReplay"
14. type "/users/wflogger/3DReplay/bin/processNav.perl >> /users/wflogger/3DReplay/.wfNAVmail 2>&1"
15. Login to wharfrat as ?????
16. Open up a command prompt window.
17. type "d:"
18. type "cd ARCHIVE/nav/YYYY/tibr"
19. type "ftp batray.wf.mbari.org"
20. login as wflogger (p/w 2020rov)
21. type "cd 3DReplay/ARCHIVE/nav/2002/tibr"
22. type "mget *tibr.txt" (answer yes to each file)
22.1. type "lcd ../wfly"
22.1.5. type "cd ../wfly"
22.2. type "mget *wfly.txt" (answer yes to each file)
23. type "quit" (when FTPing is done).
24. Login to beroe using account vicki (p/w gmsvnow!)
25. type "bin/wfc2web Tiburon YYYYDDD"
26. Go back to wharfrat's command prompt window.
27. type "d:"
28. type "cd expdlog\loads"
29. type "popExpeditionData.pl YYYYDDD YYYYDDD" or "popExpeditionData.pl last(nDays)"