From f7a7dfb934a2637795e69079066a5e8c0e9a7d4c Mon Sep 17 00:00:00 2001
From: David Walter Seikel
Date: Fri, 19 Feb 2016 15:28:37 +1000
Subject: Run the docs through a spell checker.

---
 docs/BVJ.html                        | 10 +++++-----
 docs/ClientHamr/README.GuiLua        |  4 ++--
 docs/Croquet-integration.html        |  8 ++++----
 docs/Grid-data-flow.html             |  2 +-
 docs/InworldAnimationEditor.html     | 10 +++++-----
 docs/LSL-functions-implemented.html  |  2 +-
 docs/LuaSL-New-scripting-engine.html | 36 ++++++++++++++++++------------------
 docs/NGIW.Commands.html              |  4 ++--
 docs/NGIW.html                       |  8 ++++----
 docs/Nails.html                      |  6 +++---
 docs/OMG-WTF-BBQ.html                |  2 +-
 docs/SledjHamr.html                  |  4 ++--
 docs/SledjHamr/portals.txt           |  4 ++--
 13 files changed, 50 insertions(+), 50 deletions(-)

(limited to 'docs')

diff --git a/docs/BVJ.html b/docs/BVJ.html
index ce1a24e..a539705 100644
--- a/docs/BVJ.html
+++ b/docs/BVJ.html
@@ -66,7 +66,7 @@ implement animating link sets, and interaction with the sim physics.

Semantics

The semantics are that the "Hips" and the "RightUpLeg" of something are animated. The mapping from the BVJ file's "NAME" fields to the avatar skeleton is straightforward, the names in the BVJ are matched against the names of the skeleton components. Then the appropriate rotations and translations are applied frame by frame.

The point of the hierarchy is so that when a joint moves or rotates, it's children get carried along for the ride. When you turn the hips left in the sample BVJ, the whole body turns left.

-Note that the "offset" value isn't actually used when animating avatars. The position of the hips and the angles are all used, but no attempt is made to match the skeleton bone lengths to the BVH segment lengths. So I propose that we can eliminate them or make them optional to reducing lag and file size. The offsets are useful in other tools because they define a skeleton that can be visualized.

+Note that the "offset" value isn't actually used when animating avatars. The position of the hips and the angles are all used, but no attempt is made to match the skeleton bone lengths to the BVH segment lengths. So I propose that we can eliminate them or make them optional to reduce lag and file size. The offsets are useful in other tools because they define a skeleton that can be visualised.

 

Attachment points

The hierarchy portion of a BVJ is a fine place to express attachment points.

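[Editorial aside, not part of the patch: the hierarchy semantics described in the hunk above, where rotating a joint carries its children along, can be sketched as follows. This is an illustrative toy, not the viewer's code; the function and field names are invented, and rotations are flattened to 2D angles to keep it short, where a real implementation would use quaternions or matrices.]

```python
import math

def world_positions(joint, parent_pos=(0.0, 0.0), parent_angle=0.0, out=None):
    """Walk a joint tree, accumulating each parent's rotation and offset."""
    if out is None:
        out = {}
    # A child's rotation is relative to its parent, so angles accumulate.
    angle = parent_angle + joint.get("rotation", 0.0)
    ox, oy = joint.get("offset", (0.0, 0.0))
    # Rotate the child's offset by the accumulated parent angle.
    x = parent_pos[0] + ox * math.cos(parent_angle) - oy * math.sin(parent_angle)
    y = parent_pos[1] + ox * math.sin(parent_angle) + oy * math.cos(parent_angle)
    out[joint["name"]] = (x, y)
    for child in joint.get("children", []):
        world_positions(child, (x, y), angle, out)
    return out

# Turning the hips carries the whole leg along for the ride.
skeleton = {"name": "Hips", "rotation": math.pi / 2, "children": [
    {"name": "RightUpLeg", "offset": (1.0, 0.0)}]}
print(world_positions(skeleton))
```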
@@ -101,13 +101,13 @@ implement animating link sets, and interaction with the sim physics.

Animating Prims

BVH was defined with skeletons in mind. But, at first glance it seems that if there were some two (or more) prim object with a root prim named "Hips" and another prim named "RightUpLeg" we should be able to animate that link set using this same BVH/BVJ file.

The one issue is that segments in the BVH model are like vectors, they have a near end, a far end, a length, and they rotate about their near end. In particular bones have no width or depth, only length.

-So I propose adding "PIVOT":[x,y,z] to define about what part of a prim the prim rotates when being animated. When omitted the center of the prim is used, and is equivalent to "PIVOT":[0,0,0]. The pivot ranges from -1 to 1 on each axis with -1 meaning the small end and 1 the large end. For example consider a cylinder, "PIVOT":[0,0,0.5] would rotate about the point midway between the center of the cylinder and the +Z face of the cylinder, i.e. half way up to the top. "PIVOT":[0,0,1] would make the cylinder act like a normal bone making up a skeleton.

+So I propose adding "PIVOT":[x,y,z] to define about what part of a prim the prim rotates when being animated. When omitted the centre of the prim is used, and is equivalent to "PIVOT":[0,0,0]. The pivot ranges from -1 to 1 on each axis with -1 meaning the small end and 1 the large end. For example consider a cylinder, "PIVOT":[0,0,0.5] would rotate about the point midway between the centre of the cylinder and the +Z face of the cylinder, i.e. half way up to the top. "PIVOT":[0,0,1] would make the cylinder act like a normal bone making up a skeleton.

 

Animating Attached Prims

Things are interesting when I want to define an animation of my avy and an attachment to my avy. Suppose when applying an animation from a BVH or BVJ that I get to a joint named "tail" with a defined attachment point 103. If my avatar is wearing something at point 103, then search that object for a prim named "tail". If I find a prim named "tail" in the attachment then this joint's animation applies to that prim. And all the children of the "tail" joint in the animation are sought in the link set of the attachment.

 

Yet More Attachment Points

-Why yes, that *does* mean attachments can have attachments, glad you asked. Suppose my tail has three bones, and the attachment point defined for the last bone is 104. I could attach the tail to point 103, and a pretty bow to point 104. The data model would be avy attachment point 103 has "thin neko tail with pink tip" attached, and avy attachment point 104 has "pretty bow" attached. But because point 104 is defined on a child joint of the joint with atachment point 103, the object "Pretty Bow" would move with a part of the tail, not with some random part of the avy.

+Why yes, that *does* mean attachments can have attachments, glad you asked. Suppose my tail has three bones, and the attachment point defined for the last bone is 104. I could attach the tail to point 103, and a pretty bow to point 104. The data model would be avy attachment point 103 has "thin neko tail with pink tip" attached, and avy attachment point 104 has "pretty bow" attached. But because point 104 is defined on a child joint of the joint with attachment point 103, the object "Pretty Bow" would move with a part of the tail, not with some random part of the avy.

 

Sampling and Keyframing

The BVH file format was originally created for motion capture. So it defines animations by means of sampling. The same way a motion picture film samples the world 24 times a second making still photographs, the BVH captures the values on all the channels at regular points in time. But not all animations are created by motion capture, perhaps most are made with a keyframing animation system.

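[Editorial aside, not part of the patch: the "PIVOT":[x,y,z] proposal earlier in this hunk maps neatly to prim-local coordinates. A minimal sketch, assuming a box-shaped bounding volume whose size is known; `pivot_point` is a made-up helper name, not anything in the codebase.]

```python
def pivot_point(pivot, prim_size):
    """Resolve a PIVOT value to the local-frame point a prim rotates about.

    Each pivot axis runs from -1 (small end) to 1 (large end), with
    [0, 0, 0] being the centre, so the offset is just pivot * half-size.
    """
    return tuple(p * s / 2.0 for p, s in zip(pivot, prim_size))

# A cylinder 2 units tall: [0,0,0.5] is midway between the centre and the
# +Z face, and [0,0,1] sits on the +Z face, like the near end of a bone.
print(pivot_point([0, 0, 0.5], (1.0, 1.0, 2.0)))  # (0.0, 0.0, 0.5)
print(pivot_point([0, 0, 1], (1.0, 1.0, 2.0)))    # (0.0, 0.0, 1.0)
```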
@@ -148,10 +148,10 @@ implement animating link sets, and interaction with the sim physics.

 

 

At This Time

-The last enhancement I want to make is to add absolute time references. Consider using the BVJ file to define the animations of the hands on a analog clock. I would like to be able to express "At noon, all hands are pointing up." What this means is when invoking an animation we need to map from the Unix time to the animation's time. This is a linear mapping so two numbers are required, one expresses how many animation seconds elapse for each Unix second, the second specifies at what Unix time at which the animation time 0 occurs. There is a third number implied by looping animations. How long the animation is. Note a looping animation often begins to loop at some point after animation time 0 and ends before the largest animation time in the file. This is due to the types of interpolation used when keyframing. Linear interpolation requires two keyframes before a position can be known, quadratic 3, and cubic 4.

+The last enhancement I want to make is to add absolute time references. Consider using the BVJ file to define the animations of the hands on an analogue clock. I would like to be able to express "At noon, all hands are pointing up." What this means is when invoking an animation we need to map from the Unix time to the animation's time. This is a linear mapping so two numbers are required: one expresses how many animation seconds elapse for each Unix second, the second specifies the Unix time at which animation time 0 occurs. There is a third number implied by looping animations: how long the animation is. Note a looping animation often begins to loop at some point after animation time 0 and ends before the largest animation time in the file. This is due to the types of interpolation used when keyframing. Linear interpolation requires two keyframes before a position can be known, quadratic 3, and cubic 4.

 

Tools to Make BVJ Files

-Currently there are none, but see InworldAnimationEditor for my ideas. It should be obvious how to transform a BVH into a BVJ file that uses sampling. By looking at the rates of change of channels it is possible to discover inflection points and use them to synthesize a keyframe representation that is a close match to a set of samples.

+Currently there are none, but see InworldAnimationEditor for my ideas. It should be obvious how to transform a BVH into a BVJ file that uses sampling. By looking at the rates of change of channels it is possible to discover inflection points and use them to synthesise a keyframe representation that is a close match to a set of samples.

And, of course, I want to make this file format editable in-world using nice GUI and 3D editing tools. Basically a clone of QAvimator in the client.

 

Client to Client

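[Editorial aside, not part of the patch: the absolute-time mapping described in the "At This Time" hunk above, with its two numbers plus an implied loop length, can be sketched as a one-liner. `animation_time`, `scale`, `epoch`, and `loop` are invented names for illustration.]

```python
def animation_time(unix_time, scale, epoch, loop=None):
    """Map Unix time to animation time.

    scale: animation seconds that elapse per Unix second.
    epoch: the Unix time at which animation time 0 occurs.
    loop:  optional loop length in animation seconds, the third
           number implied by looping animations.
    """
    t = (unix_time - epoch) * scale
    if loop is not None:
        t %= loop
    return t

# A clock animation: 1 animation second per Unix second, zeroed at the
# Unix epoch (midnight, when all hands point up), looping every 12 hours.
print(animation_time(1_000_000, 1.0, 0, loop=12 * 3600))  # 6400.0
```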
diff --git a/docs/ClientHamr/README.GuiLua b/docs/ClientHamr/README.GuiLua
index ba0bf7d..8b62860 100644
--- a/docs/ClientHamr/README.GuiLua
+++ b/docs/ClientHamr/README.GuiLua
@@ -443,9 +443,9 @@ object = eo_add(EVAS_OBJ_LINE_CLASS, canvas);

Eo.h -> EO_DEFINE_CLASS is a macro that basically wraps eo_class_new(),
and returns it's result. So Eo_Class is the type of a class, but it's
A) opaque, B) deprecated!
-It includes a pointor to the Eo_Class_Description, which includes the
+It includes a pointer to the Eo_Class_Description, which includes the
actual name. I'm not seeing anywhere the names of the get/set
-paramaters being passed into the system, or any way to look up a class
+parameters being passed into the system, or any way to look up a class
based on name. Not even a way to get to the public Eo_Class_Description
from the opaque Eo_Class.

diff --git a/docs/Croquet-integration.html b/docs/Croquet-integration.html
index 1b4d072..b94ab09 100644
--- a/docs/Croquet-integration.html
+++ b/docs/Croquet-integration.html
@@ -62,7 +62,7 @@ plugin that leverages all of Squeak/Croquet's functionality to enhance
some other viewer shouldn't be sneered at. Right now, the SL viewer
(for example) barely provides access to raw mouse coordinates for UV
tracking on a texture (the current media plugin scenario), but there's no reason
-why any arbitrary event or internet packet couldn't be intercepted and
+why any arbitrary event or Internet packet couldn't be intercepted and
shunted off to squeak for pre/post processing.

http://wiki.secondlife.com/wiki/User:Saijanai_Kuhn/Plugins_discussion#Proposed_Extension_to_Media_Plugin

@@ -87,7 +87,7 @@ http://www.metanomics.net/
Start interacting with internal viewer events, and you can leverage
physics/graphics creation/etc from the Squeak/Croquet side, and merge
it directly into a local SL instance for custom puppeteering with the
-possiblilty of doing a P2P collaboration mechanima where individual
+possibility of doing a P2P collaboration mechanima where individual
avatars can be controlled by a single machine using a script and/or
timeline control interface. The resulting avatar activity can be
"filmed" for mechanima, or could be uploaded to a central server for

@@ -97,8 +97,8 @@ http://wiki.secondlife.com/wiki/User:Saijanai_Kuhn/Plugins_discussion#Puppeteeri
Instead of using 2D projections, you could also leverage the 3D portal
-system of Croquet to inject 3D cenes from Croquet into a given virtual
-world viewer, and either maintain a backk-end P2P connection between
+system of Croquet to inject 3D scenes from Croquet into a given virtual
+world viewer, and either maintain a back-end P2P connection between
participants, or shoot the composite scene to a central server in some
fashion using the existing virtual world protocols.

diff --git a/docs/Grid-data-flow.html b/docs/Grid-data-flow.html
index 5341ce9..1f3ff93 100644
--- a/docs/Grid-data-flow.html
+++ b/docs/Grid-data-flow.html
@@ -126,7 +126,7 @@
http://wiki.secondlife.com/wiki/User:Saijanai_Kuhn/Plugins_discussion#Proposed_Extension_to_Media_Plugin @@ -87,7 +87,7 @@ http://www.metanomics.net/ Start interacting with internal viewer events, and you can leverage physics/graphics creation/etc from the Squeak/Croquet side, and merge it directly into a local SL instance for custom puppeteering with the -possiblilty of doing a P2P collaboration mechanima where individual +possibility of doing a P2P collaboration mechanima where individual avatars can be controlled by a single machine using a script and/or timeline control interface. The resulting avatar activity can be "filmed" for mechanima, or could be uploaded to a central server for @@ -97,8 +97,8 @@ http://wiki.secondlife.com/wiki/User:Saijanai_Kuhn/Plugins_discussion#Puppeteeri Instead of using 2D projections, you could also leverage the 3D portal -system of Croquet to inject 3D cenes from Croquet into a given virtual -world viewer, and either maintain a backk-end P2P connection between +system of Croquet to inject 3D scenes from Croquet into a given virtual +world viewer, and either maintain a back-end P2P connection between participants, or shoot the composite scene to a central server in some fashion using the existing virtual world protocols. diff --git a/docs/Grid-data-flow.html b/docs/Grid-data-flow.html index 5341ce9..1f3ff93 100644 --- a/docs/Grid-data-flow.html +++ b/docs/Grid-data-flow.html @@ -126,7 +126,7 @@