From f7a7dfb934a2637795e69079066a5e8c0e9a7d4c Mon Sep 17 00:00:00 2001
From: David Walter Seikel
Date: Fri, 19 Feb 2016 15:28:37 +1000
Subject: Run the docs through a spell checker.

---
 BUILD.txt                            |  2 +-
 README                               |  4 ++--
 STANDARDS.txt                        | 10 +++++-----
 TODO                                 |  8 ++++----
 VISION.txt                           |  4 ++--
 docs/BVJ.html                        | 10 +++++-----
 docs/ClientHamr/README.GuiLua        |  4 ++--
 docs/Croquet-integration.html        |  8 ++++----
 docs/Grid-data-flow.html             |  2 +-
 docs/InworldAnimationEditor.html     | 10 +++++-----
 docs/LSL-functions-implemented.html  |  2 +-
 docs/LuaSL-New-scripting-engine.html | 36 ++++++++++++++++++------------------
 docs/NGIW.Commands.html              |  4 ++--
 docs/NGIW.html                       |  8 ++++----
 docs/Nails.html                      |  6 +++---
 docs/OMG-WTF-BBQ.html                |  2 +-
 docs/SledjHamr.html                  |  4 ++--
 docs/SledjHamr/portals.txt           |  4 ++--
 killem.sh                            |  2 +-
 19 files changed, 65 insertions(+), 65 deletions(-)

diff --git a/BUILD.txt b/BUILD.txt
index 67fc58c..3cac347 100644
--- a/BUILD.txt
+++ b/BUILD.txt
@@ -24,7 +24,7 @@ http://git.enlightenment.org/core/
 
 It should all work under X window managers other than Enlightenment,
 though this has not been tested. The same applies to Linux distros
-other than Ubuntu 12.04, that's the only one that hes been tested by me.
+other than Ubuntu 12.04, that's the only one that has been tested by me.
 
 This is all still experimental, but I intend to get it to work under
 Mac OS X and Windows as well.
diff --git a/README b/README
index f04c431..66f6c16 100644
--- a/README
+++ b/README
@@ -259,7 +259,7 @@ audits, and less places for things to go wrong, or escape attention.
 Commonly used security and privacy protocols also tend to be built in
 modular ways, so we can make use of these things ourselves.
 
-Pretty and skinable.
+Pretty and skinnable.
 --------------------
 
 The Enlightenment Foundation Libraries (EFL) where written to support
@@ -553,7 +553,7 @@ IM client
 web browser
 	We may have to deal with external web proggies, which may in the
 	end be the best way. Except for MOAP. lol
-	Maybe later we can have a HTML to GuiLua translater module? Muahahaha
+	Maybe later we can have a HTML to GuiLua translator module? Muahahaha
 
 IAR & OAR file reader / writer
 	loads up the file, sending it as nails commands
diff --git a/STANDARDS.txt b/STANDARDS.txt
index c20a258..ad0f4b5 100644
--- a/STANDARDS.txt
+++ b/STANDARDS.txt
@@ -36,15 +36,15 @@ On the other hand, we do want full functionality to be available to
 high level scripting languages. While Lua is the chosen language,
 there's been some success translating to and from Lua which may help us
 be more agnostic here. There's others working on translating things to Lua,
-based on the theory that since it's the fastest scripting lagnuage, that
+based on the theory that since it's the fastest scripting language, that
 might help speed up other scripting languages.
 
 In particular, translating LSL into Lua has so for been working well.
-The parts that can be benchmarked soo far are very fast indeed, making
+The parts that can be benchmarked so far are very fast indeed, making
 SL and OS look like the crippled snails that they are. Also, there's
 some evidence that we might be able to translate Lua into ECMAScript to
 run in web browsers, or run a Lua VM inside the ECMAScript VM.
-Alledgedly, while the scripts run slower than LuaJIT can run them, they
+Allegedly, while the scripts run slower than LuaJIT can run them, they
 are still quite fast compared to ECMAScript. We need this to support
 legacy web browsers, which decided long ago to standardise on ECMAScript.
 Alice loves her some JavaScript and node.js.
 
@@ -64,7 +64,7 @@ wanted to use a different markup syntax. lol
 
 I solved this by installing a few markup systems, but the majority of
 it got written as what Drupal refers to as "simple formatting", which
-basically allows the use of certain HTML tags, plus a few nicities.
+basically allows the use of certain HTML tags, plus a few niceties.
 
 Eventually code started to get written by me, I put it on GitHub. At
 around the same time, I was busy transferring the Team Purple website
@@ -91,7 +91,7 @@ that's not our main thrust for UI stuff.
 Sooooo, to avoid a lot of work, how about we just make the
 documentation static HTML web pages, and be done with it? Markup
 systems are just a way to make HTML more palatable to ordinary users, but ordinary users
-are not writing the design documentiation and such. They all get
+are not writing the design documentation and such. They all get
 translated to HTML, so let's just write raw HTML ourselves. Keep it
 simple, no fancy CSS or Javascript shit. So we avoid markup, we avoid
 having to translate from markup to markup, and we avoid the rendering
diff --git a/TODO b/TODO
index 4b23cb8..e340799 100644
--- a/TODO
+++ b/TODO
@@ -32,7 +32,7 @@ Runnr.c
 	Starts or resumes the script running. If there's a message, push
 	it as an argument. Expects the script to eventually call Runnr.receive().
     Runnr.receive()
-	Does a Lua yeild, which will return once there is a message for this script.
+	Does a Lua yield, which will return once there is a message for this script.
     Runnr.send()
 	Either send2script(SID, message), or send2server(message)
@@ -201,7 +201,7 @@ More generic event system?
 	There are other events flying around.
 	BASIC event	Just send it to LuaSL, it loops through all scripts, sending the event to them all if they have an event function in their current state.
-	REGISTER event	Script has to register for these events with love. Can have filter. love loops through all regisitered "scripts", applies filters, sends events.
+	REGISTER event	Script has to register for these events with love. Can have filter. love loops through all registered "scripts", applies filters, sends events.
 	REQUEST event	Script has to request an event from love. Can have a filter. Love sends event straight back to the script. Once. This covers the "wait for return value" case as well. events.return()?
 	touch_start goes from extantz to LuaSL, via love. Most of the rest go from love to LuaSL I think. state_entry, state_exit are internal to LuaSL, timer should be.
@@ -217,7 +217,7 @@ More generic event system?
 	Sensor and no_sensor might be common, but you request those, your silly fault, they have filters though.
 	    llSensor(string name, string id, integer type, float range, float arc)
 	    llSensorRepeat(string name, string id, integer type, float range, float arc, float rate)
-	Event types can have filters? Events can be types? In otherwords, generic filter system, attach say filters to say type events, etc.
+	Event types can have filters? Events can be types? In other words, generic filter system, attach say filters to say type events, etc.
 	Figure this all out as we go along.
 parseStream() / send2() both -
@@ -530,7 +530,7 @@ I can use the llDialog() or purkle widgets, especially when you can rip
 them out of the main window to be their own windows.
 
 Once the files widget morphs into a proper inventory widget, I can use
-it in meta-impy for inventory AND an asyncronous file requestor.
+it in meta-impy for inventory AND an asynchronous file requester.
 
 Finally (I think) I could rip the meta-impy graphics pipeline a new one.
diff --git a/VISION.txt b/VISION.txt
index aef5719..7c353e6 100644
--- a/VISION.txt
+++ b/VISION.txt
@@ -2,7 +2,7 @@ I don't usually bother with even reading vision statements, and I'm a
 compulsive speed reader. Before I know I should not be reading
 something, it's too late, already read it. lol
 
-SledjHamr is my lifes passion though, I've been interested in this sort
+SledjHamr is my life's passion though, I've been interested in this sort
 of thing since the '80s and '90s. I've been thinking about how best to
 describe it, and it's turning into a sort of vision statement. So here
 it is. I'll try to avoid being too hand wavy and wanky.
@@ -103,6 +103,6 @@ corporate computer.
 
 Other virtual world protocols will be supported, the SL / OS ones will
 be just a wrapper around the SledjHamr protocol. For direct SledjHamr to
-SledjHamr things should be noticably faster and smoother. Obviously any
+SledjHamr things should be noticeably faster and smoother. Obviously any
 particular protocol will be limited by that protocols own limits.
 
diff --git a/docs/BVJ.html b/docs/BVJ.html
index ce1a24e..a539705 100644
--- a/docs/BVJ.html
+++ b/docs/BVJ.html
@@ -66,7 +66,7 @@ implement animating link sets, and interaction with the sim physics.

Semantics

The semantics are that the "Hips" and the "RightUpLeg" of something are animated. The mapping from the BVJ file's "NAME" fields to the avatar skeleton is straightforward, the names in the BVJ are matched against the names of the skeleton components. Then the appropriate rotations and translations are applied frame by frame.

The point of the hierarchy is so that when a joint moves or rotates, it's children get carried along for the ride. When you turn the hips left in the sample BVJ, the whole body turns left.
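A rough Lua sketch of what "carried along for the ride" means in practice; everything here (the joint tables, field names, a single Z angle standing in for full rotations) is made up for illustration, not how SledjHamr actually stores joints:

    -- Rough sketch only, not SledjHamr code.
    local function rotate_z(p, angle)
      local c, s = math.cos(angle), math.sin(angle)
      return { x = p.x * c - p.y * s, y = p.x * s + p.y * c, z = p.z }
    end

    -- Walk the hierarchy: a child's position is its offset rotated by whatever
    -- its ancestors have done, so turning the "Hips" turns the whole body.
    local function apply(joint, parent_pos, parent_angle)
      local off = rotate_z(joint.offset, parent_angle)
      joint.world = { x = parent_pos.x + off.x, y = parent_pos.y + off.y, z = parent_pos.z + off.z }
      local angle = parent_angle + (joint.rotation or 0)
      for _, child in ipairs(joint.children or {}) do
        apply(child, joint.world, angle)
      end
    end

    local hips = { name = "Hips", offset = { x = 0, y = 0, z = 0 }, rotation = math.rad(90),
                   children = { { name = "RightUpLeg", offset = { x = 0.1, y = 0, z = -0.4 } } } }
    apply(hips, { x = 0, y = 0, z = 0 }, 0)
    -- RightUpLeg ends up at roughly (0, 0.1, -0.4): it turned with the hips.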

-Note that the "offset" value isn't actually used when animating avatars. The position of the hips and the angles are all used, but no attempt is made to match the skeleton bone lengths to the BVH segment lengths. So I propose that we can eliminate them or make them optional to reducing lag and file size. The offsets are useful in other tools because they define a skeleton that can be visualized.
+Note that the "offset" value isn't actually used when animating avatars. The position of the hips and the angles are all used, but no attempt is made to match the skeleton bone lengths to the BVH segment lengths. So I propose that we can eliminate them or make them optional to reducing lag and file size. The offsets are useful in other tools because they define a skeleton that can be visualised.

 

Attachment points

The hierarchy portion of a BVJ is a fine place to express attachment points.

@@ -101,13 +101,13 @@ implement animating link sets, and interaction with the sim physics.

Animating Prims

BVH was defined with skeletons in mind. But, at first glance it seems that if there were some two (or more) prim object with a root prim named "Hips" and another prim named "RightUpLeg" we should be able to animate that link set using this same BVH/BVJ file.

The one issue is that segments in the BVH model are like vectors, they have a near end, a far end, a length, and they rotate about their near end. In particular bones have no width or depth, only length.

-So I propose adding "PIVOT":[x,y,z] to define about what part of a prim the prim rotates when being animated. When omitted the center of the prim is used, and is equivalent to "PIVOT":[0,0,0]. The pivot ranges from -1 to 1 on each axis with -1 meaning the small end and 1 the large end. For example consider a cylinder, "PIVOT":[0,0,0.5] would rotate about the point midway between the center of the cylinder and the +Z face of the cylinder, i.e. half way up to the top. "PIVOT":[0,0,1] would make the cylinder act like a normal bone making up a skeleton.
+So I propose adding "PIVOT":[x,y,z] to define about what part of a prim the prim rotates when being animated. When omitted the centre of the prim is used, and is equivalent to "PIVOT":[0,0,0]. The pivot ranges from -1 to 1 on each axis with -1 meaning the small end and 1 the large end. For example consider a cylinder, "PIVOT":[0,0,0.5] would rotate about the point midway between the centre of the cylinder and the +Z face of the cylinder, i.e. half way up to the top. "PIVOT":[0,0,1] would make the cylinder act like a normal bone making up a skeleton.
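A rough Lua sketch of how a "PIVOT" value could be mapped to an actual point on a prim, assuming we know the prim's centre and bounding box size (prim.centre and prim.size are illustrative names, not any existing API):

    -- Sketch only: PIVOT components range from -1 to 1 along each axis.
    local function pivot_point(prim, pivot)
      return {
        x = prim.centre.x + 0.5 * prim.size.x * pivot[1],
        y = prim.centre.y + 0.5 * prim.size.y * pivot[2],
        z = prim.centre.z + 0.5 * prim.size.z * pivot[3],
      }
    end

    -- "PIVOT":[0,0,0]   -> the centre of the prim, the default.
    -- "PIVOT":[0,0,0.5] -> half way between the centre and the +Z face.
    -- "PIVOT":[0,0,1]   -> the +Z face itself, so a cylinder swings like a bone.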

 

Animating Attached Prims

Things are interesting when I want to define an animation of my avy and an attachment to my avy. Suppose when applying an animation from a BVH or BVJ that I get to a joint named "tail" with a defined attachment point 103. If my avatar is wearing something at point 103, then search that object for a prim named "tail". If I find a prim named "tail" in the attachment then this joint's animation applies to that prim. And all the children of the "tail" joint in the animation are sought in the link set of the attachment.
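Something like this Lua sketch is the lookup I have in mind, with every table and field name being purely illustrative:

    -- Sketch of the lookup described above; nothing here is real API.
    local function find_prim(linkset, name)
      for _, prim in ipairs(linkset) do
        if prim.name == name then return prim end
      end
    end

    -- A joint with an attachment point animates a prim of the same name inside
    -- whatever is attached there, otherwise it animates the skeleton bone.
    local function resolve_joint(avatar, joint)
      if joint.attachment_point then
        local attached = avatar.attachments[joint.attachment_point]  -- e.g. 103 -> the tail object
        if attached then
          return find_prim(attached.linkset, joint.name)             -- e.g. the "tail" prim
        end
      end
      return avatar.skeleton[joint.name]
    end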

 

Yet More Attachment Points

-Why yes, that *does* mean attachments can have attachments, glad you asked. Suppose my tail has three bones, and the attachment point defined for the last bone is 104. I could attach the tail to point 103, and a pretty bow to point 104. The data model would be avy attachment point 103 has "thin neko tail with pink tip" attached, and avy attachment point 104 has "pretty bow" attached. But because point 104 is defined on a child joint of the joint with atachment point 103, the object "Pretty Bow" would move with a part of the tail, not with some random part of the avy.
+Why yes, that *does* mean attachments can have attachments, glad you asked. Suppose my tail has three bones, and the attachment point defined for the last bone is 104. I could attach the tail to point 103, and a pretty bow to point 104. The data model would be avy attachment point 103 has "thin neko tail with pink tip" attached, and avy attachment point 104 has "pretty bow" attached. But because point 104 is defined on a child joint of the joint with attachment point 103, the object "Pretty Bow" would move with a part of the tail, not with some random part of the avy.

 

Sampling and Keyframing

The BVH file format was originally created for motion capture. So it defines animations by means of sampling. The same way a motion picture film samples the world 24 times a second making still photographs, the BVH captures the values on all the channels at regular points in time. But not all animations are created by motion capture, perhaps most are made with a keyframing animation system.
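For example, a keyframed channel could be evaluated with plain linear interpolation, along the lines of this Lua sketch (the key layout is just for illustration, nothing is settled):

    -- Rough sketch: a sampled channel stores one value per frame, a keyframed
    -- channel stores (time, value) pairs and interpolates between them.
    -- Linear interpolation shown; quadratic or cubic need 3 or 4 keys.
    local function channel_value(keys, t)
      if t <= keys[1].time then return keys[1].value end
      for i = 2, #keys do
        if t <= keys[i].time then
          local a, b = keys[i - 1], keys[i]
          local f = (t - a.time) / (b.time - a.time)   -- 0..1 between the two keys
          return a.value + f * (b.value - a.value)
        end
      end
      return keys[#keys].value                          -- past the last keyframe
    end

    -- Two keyframes replace what sampling would store as dozens of frames:
    local zrot = { { time = 0.0, value = 0 }, { time = 2.0, value = 90 } }
    print(channel_value(zrot, 1.0))                     --> 45 degrees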

@@ -148,10 +148,10 @@ implement animating link sets, and interaction with the sim physics.

 

 

At This Time

-The last enhancement I want to make is to add absolute time references. Consider using the BVJ file to define the animations of the hands on a analog clock. I would like to be able to express "At noon, all hands are pointing up." What this means is when invoking an animation we need to map from the Unix time to the animation's time. This is a linear mapping so two numbers are required, one expresses how many animation seconds elapse for each Unix second, the second specifies at what Unix time at which the animation time 0 occurs. There is a third number implied by looping animations. How long the animation is. Note a looping animation often begins to loop at some point after animation time 0 and ends before the largest animation time in the file. This is due to the types of interpolation used when keyframing. Linear interpolation requires two keyframes before a position can be known, quadratic 3, and cubic 4.
+The last enhancement I want to make is to add absolute time references. Consider using the BVJ file to define the animations of the hands on a analogue clock. I would like to be able to express "At noon, all hands are pointing up." What this means is when invoking an animation we need to map from the Unix time to the animation's time. This is a linear mapping so two numbers are required, one expresses how many animation seconds elapse for each Unix second, the second specifies at what Unix time at which the animation time 0 occurs. There is a third number implied by looping animations. How long the animation is. Note a looping animation often begins to loop at some point after animation time 0 and ends before the largest animation time in the file. This is due to the types of interpolation used when keyframing. Linear interpolation requires two keyframes before a position can be known, quadratic 3, and cubic 4.
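As a Lua sketch of that mapping, with rate, origin and the loop values being the numbers described above (all names are illustrative only):

    -- 'rate' is animation seconds per Unix second, 'origin' is the Unix time at
    -- which animation time 0 occurs; the loop values only matter for loops.
    local function anim_time(unix_time, rate, origin, loop_start, loop_length)
      local t = (unix_time - origin) * rate
      if loop_length then
        t = loop_start + ((t - loop_start) % loop_length)   -- wrap back inside the loop
      end
      return t
    end

    -- "At noon, all hands are pointing up": pick an origin that is exactly noon,
    -- a rate of 1, and author the clock animation so its time 0 is the noon pose.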

 

Tools to Make BVJ Files

-Currently there are none, but see InworldAnimationEditor for my ideas. It should be obvious how to transform a BVH into a BVJ file that uses sampling. By looking at the rates of change of channels it is possible to discover inflection points and use them to synthesize a keyframe representation that is a close match to a set of samples.
+Currently there are none, but see InworldAnimationEditor for my ideas. It should be obvious how to transform a BVH into a BVJ file that uses sampling. By looking at the rates of change of channels it is possible to discover inflection points and use them to synthesise a keyframe representation that is a close match to a set of samples.
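A very rough Lua sketch of that synthesis idea, keeping a keyframe wherever a channel's rate of change flips sign (all names made up):

    -- 'samples' is one value per frame, 'dt' is the sample interval in seconds.
    local function keyframes_from_samples(samples, dt)
      local keys = { { time = 0, value = samples[1] } }
      for i = 2, #samples - 1 do
        local before = samples[i] - samples[i - 1]
        local after = samples[i + 1] - samples[i]
        if before * after < 0 then                          -- the channel changed direction
          keys[#keys + 1] = { time = (i - 1) * dt, value = samples[i] }
        end
      end
      keys[#keys + 1] = { time = (#samples - 1) * dt, value = samples[#samples] }
      return keys
    end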

And, of course, I want to make this file format editable in-world using nice GUI and 3D editing tools. Basically a clone of QAvimator in the client.

 

Client to Client

diff --git a/docs/ClientHamr/README.GuiLua b/docs/ClientHamr/README.GuiLua
index ba0bf7d..8b62860 100644
--- a/docs/ClientHamr/README.GuiLua
+++ b/docs/ClientHamr/README.GuiLua
@@ -443,9 +443,9 @@ object = eo_add(EVAS_OBJ_LINE_CLASS, canvas);
 Eo.h ->
 EO_DEFINE_CLASS is a macro that basically wraps eo_class_new(), and
 returns it's result. So Eo_Class is the type of a class, but it's A)
 opaque, B) deprecated!
-It includes a pointor to the Eo_Class_Description, which includes the
+It includes a pointer to the Eo_Class_Description, which includes the
 actual name. I'm not seeing anywhere the names of the get/set
-paramaters being passed into the system, or any way to look up a class
+parameters being passed into the system, or any way to look up a class
 based on name. Not even a way to get to the public Eo_Class_Description
 from the opaque Eo_Class.
diff --git a/docs/Croquet-integration.html b/docs/Croquet-integration.html
index 1b4d072..b94ab09 100644
--- a/docs/Croquet-integration.html
+++ b/docs/Croquet-integration.html
@@ -62,7 +62,7 @@ plugin that leverages all of Squeak/Croquet's functionality to enhance
 some other viewer shouldn't be sneered at. Right now, the SL viewer
 (for example) barely provides access to raw mouse coordinates for UV
 tracking on a texture (the current media plugin scenario), but there's no reason
-why any arbitrary event or internet packet couldn't be intercepted and
+why any arbitrary event or Internet packet couldn't be intercepted and
 shunted off to squeak for pre/post processing.
 
 http://wiki.secondlife.com/wiki/User:Saijanai_Kuhn/Plugins_discussion#Proposed_Extension_to_Media_Plugin
@@ -87,7 +87,7 @@ http://www.metanomics.net/
 Start interacting with internal viewer events, and you can leverage
 physics/graphics creation/etc from the Squeak/Croquet side, and merge it
 directly into a local SL instance for custom puppeteering with the
-possiblilty of doing a P2P collaboration mechanima where individual
+possibility of doing a P2P collaboration mechanima where individual
 avatars can be controlled by a single machine using a script and/or
 timeline control interface. The resulting avatar activity can be
 "filmed" for mechanima, or could be uploaded to a central server for
@@ -97,8 +97,8 @@ http://wiki.secondlife.com/wiki/User:Saijanai_Kuhn/Plugins_discussion#Puppeteeri
 
 Instead of using 2D projections, you could also leverage the 3D portal
-system of Croquet to inject 3D cenes from Croquet into a given virtual
-world viewer, and either maintain a backk-end P2P connection between
+system of Croquet to inject 3D scenes from Croquet into a given virtual
+world viewer, and either maintain a back-end P2P connection between
 participants, or shoot the composite scene to a central server in some
 fashion using the existing virtual world protocols.
diff --git a/docs/Grid-data-flow.html b/docs/Grid-data-flow.html
index 5341ce9..1f3ff93 100644
--- a/docs/Grid-data-flow.html
+++ b/docs/Grid-data-flow.html
@@ -126,7 +126,7 @@