Voxelizer1
Sep 17, 2010
Post id: 305323

How can I sample the value of a projection, 2D or 3D, in space? The projection's outColor is the value at 0,0 only.

This is essentially the same question as this one from 2002, but the answer isn't clear, possibly because of the forum conversion --

http://www.creativecrash.com/forums/animation/topics/animating-objects-with-texture-

Here's an example:








I've connected the projection outColor to each object's scale, but the objects only get the value at the texture's UV origin.

[Hypergraph screenshot]
Joojaa
Sep 17, 2010
Post id: 305324

Well, out of the box you can't. But a plane is trivial: the particle sampler can do it, and you don't need a projection in that case. The thing is, Maya is missing one node: a 3D sampler. You can go the tedious route and use colorAtPoint, but that's slow.

You should really make a node for this if you need it; it's sort of trivial if you've ever used the Maya API.

Example scene follows. Drop it in the script editor and execute (then investigate the graph):














proc string createTruncatedPyramid(){
    // Helper to add readable verbosity and modularity to the
    // main block; the name should be self-explanatory.
    $cube = `polyCube -w 1 -h 1 -d 1 -sx 1 -sy 1 -sz 1 -ax 0 1 0 -cuv 4 -ch 0`;
    scale -r -p 0cm 0.5cm 0cm 0.7 0.7 0.7 ($cube[0]+".vtx[2:5]");
    move -r 0 0.5 0 ($cube[0]+".vtx[*]");
    return $cube[0];
}

proc string[] createParticleFieldWithUVScale(int $u, int $v){
    $particle = `particle -ll 0 0 0 -ur ($u-1) 0 ($v-1) -grs 1 -c 1`;
    addAttr -ln uPP  -dt doubleArray $particle[1];
    addAttr -ln uPP0 -dt doubleArray $particle[1];
    addAttr -ln vPP  -dt doubleArray $particle[1];
    addAttr -ln vPP0 -dt doubleArray $particle[1];
    addAttr -ln returnValuePP  -dt vectorArray $particle[1];
    addAttr -ln returnValuePP0 -dt vectorArray $particle[1];
    addAttr -ln scalePP  -dt vectorArray $particle[1];
    addAttr -ln scalePP0 -dt vectorArray $particle[1];
    // Initialize uPP and vPP; writing into the PP0 attributes sets
    // the starting defaults. Note the expression strings assume the
    // default shape name particleShape1 (a fresh scene).
    dynExpression -s ("particleShape1.vPP0 = (id % "+$v+".0) / "+($v-1)+";\r\n"+
                      "particleShape1.uPP0 = (id - (id % "+$v+")) / "+($v*($u-1))+";")
        -c $particle[1];
    // The defaults are written down now, so clear the expression,
    // then reset the sim.
    dynExpression -s "" -c $particle[1];
    currentTime -10;
    currentTime 1;
    return $particle;
}

{
    $object = createTruncatedPyramid();
    $particle = createParticleFieldWithUVScale(20, 20);
    $noise = `shadingNode -asTexture noise`;
    particleInstancer -addObject -object $object
        -position worldPosition -scale scalePP
        $particle[1];
    dynExpression -s "particleShape1.scalePP = <<1, mag(returnValuePP)*3 + 0.02, 1>>;"
        -rbd $particle[1];
    dynExpression -s "particleShape1.scalePP = <<1, mag(returnValuePP)*3 + 0.02, 1>>;"
        -c $particle[1];
    // The arrayMapper samples the noise texture at each particle's
    // (uPP, vPP) and writes the result into returnValuePP.
    $am = `arrayMapper -target $particle[0]
        -destAttr returnValuePP
        -inputU uPP -inputV vPP`;
    connectAttr -f ($noise+".message")  ($am[0]+".computeNode");
    connectAttr -f ($noise+".outColor") ($am[0]+".computeNodeColor");
    setAttr ($noise+".frequencyRatio") 2;
    setAttr ($noise+".frequency") 1.5;
    setAttr ($noise+".depthMax") 2;
    setAttr ($noise+".amplitude") 0.5;
    setAttr ($noise+".density") 0.6;
    setAttr ($noise+".spottyness") 1.5;
    connectAttr -f "time1.outTime" ($noise+".time");
    $conv = `listConnections -d on -s off "time1.outTime"`;
    setAttr ($conv[0]+".conversionFactor") 0.001;
    hide $object;
}



Joojaa
Sep 18, 2010
Post id: 305328

Oh, one thing: the noise Maya sees and the noise mental ray generates are not the same.

Voxelizer1
Sep 20, 2010
Post id: 305351

Interesting, thanks for the script. I'm attaching a playblast of its output for the audience.

I see that you're using the arrayMapper node to return the value of a texture at any point given its UV coordinates, and then connecting that to a custom attribute called returnValuePP on a particle system. So in this case, since you're using a particle grid, which comes with its own UV mapping, that solves the problem of getting the UV coordinates.
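For anyone following along, the id-to-UV arithmetic that the dynExpression seeds into uPP0/vPP0 can be sketched outside Maya; this is the same formula in plain Python, assuming the grid particles are laid out row by row:

```python
def grid_uv(pid, u_count, v_count):
    """Map a particle id on a u_count x v_count grid to UVs in [0, 1].

    Mirrors the dynExpression in the script above:
        vPP0 = (id % v) / (v - 1)
        uPP0 = (id - id % v) / (v * (u - 1))
    """
    v = (pid % v_count) / (v_count - 1)                     # position within a row
    u = (pid - pid % v_count) / (v_count * (u_count - 1))   # which row
    return u, v
```

So on a 20x20 grid, particle 0 maps to (0, 0) and particle 399 maps to (1, 1), giving every particle a unique UV for the arrayMapper to sample.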

So if you knew the origin of a projection's UV map, theoretically with a few math nodes you could get the UV coordinates of any point in space without needing an intermediate object (like a nurbs plane) and a bunch of colorAtPoint commands. Then you could feed that back into an arrayMapper and connect the output to whatever you liked... Does this sound reasonable?
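To sketch the math I have in mind (my own assumptions, not a tested Maya recipe): for a planar projection, you would express the world point in the placement's local frame and remap. In plain Python, with `origin`, `u_axis`, and `v_axis` standing in for the place3dTexture transform, and assuming the projection plane spans [-1, 1] in local coordinates:

```python
def planar_uv(point, origin, u_axis, v_axis):
    """UV of a world-space point under a hypothetical planar projection.

    origin: a point on the projection plane; u_axis/v_axis: the two
    vectors spanning it. Local coords in [-1, 1] remap to UV in [0, 1].
    """
    rel = [p - o for p, o in zip(point, origin)]
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    # project onto each axis, normalized by that axis's squared length
    u = dot(rel, u_axis) / dot(u_axis, u_axis)
    v = dot(rel, v_axis) / dot(v_axis, v_axis)
    return (u + 1) / 2, (v + 1) / 2
```

With a few multiplyDivide/plusMinusAverage nodes this is exactly the kind of thing a node network could compute and feed into an arrayMapper.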








Voxelizer1
Sep 20, 2010
Post id: 305352

I've just found another example of your technique here, with in-depth explanations of each step, for those who are interested...

http://www.romainrico.com/archives/277
 

Voxelizer1
Sep 20, 2010
Post id: 305358

Using a closestPointOnMesh node I can get the UV coordinates of a given point (I can also do it with a samplerInfo node), but the arrayMapper node's uValuePP and vValuePP only accept double arrays, and the CPOM and samplerInfo outputs are floats. Is there some way besides an expression to connect these two, maybe through some kind of datatype conversion? Will that work the way I imagine? Of course the arrayMapper's outColorPP is named that because it assumes it will be used with particles, but is that necessary?

[Hypershade screenshot]
Voxelizer1
Sep 20, 2010
Post id: 305359

Continuing the search, I've found a custom Python plugin by Pascal Loef at Double Negative called sppl_colorAtPointNode:
http://forums.cgsociety.org/showpost.php?p=4956975&postcount=8
It's a wrapper of MRenderUtil::sampleShadingNetwork, and rather slow, but it does the job. It also comes with some test scripts -- I had to modify them a bit to get them to work, though I may have installed them wrong... in the 2D test script there's a call to pypl_colorAtPoint() that I believe should be sppl_colorAtPoint(). At any rate, when I changed it and ran the test, this is what I got:





 

But again, it is rather slow compared to Joojaa's particle method. Isn't there any way to get to the color value directly using an arrayMapper without using particles?

Joojaa
Sep 21, 2010
Post id: 305365

>> Then you could feed that back into an arrayMapper and connect the output to whatever you liked... Does this sound reasonable?

Yes, but you could use a particle expression to do that with minimal overhead, and with less work for the samplerInfo to calculate.

>> http://www.romainrico.com/archives/277

Yes, it just so happens that this is the remade version of the tutorial (the script you see above is a partial rewrite of a script I sent him when I suggested a faster solution; he reworked it into the tutorial). ;) The original used colorAtPoint.

>> Isn't there any way to get to the color value directly using an arrayMapper without using particles?

Well, the thing is, the reason MRenderUtil::sampleShadingNetwork becomes slow is that in a normal node network you end up calling it once per sample. But ONE call to MRenderUtil::sampleShadingNetwork that samples 100 points is about as fast as a call that samples a single point: the overhead of initializing MRenderUtil::sampleShadingNetwork is pretty enormous. So you should get some speed gains by piping many outputs from one node.
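To put rough numbers on that batching argument, here's a toy cost model in plain Python (the constants are made up purely for illustration; the point is the fixed per-call setup cost):

```python
# Toy cost model for a sampler with heavy per-call initialization.
SETUP = 100.0      # fixed setup cost per call (arbitrary units)
PER_POINT = 1.0    # marginal cost per sampled point

def cost(calls, points_per_call):
    """Total cost of sampling calls * points_per_call points."""
    return calls * (SETUP + points_per_call * PER_POINT)
```

Sampling 100 points one call at a time costs cost(100, 1) = 10100 units, while batching them into one call costs cost(1, 100) = 200: the setup cost is paid once instead of a hundred times.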

Second reason:

Particles are laid out in a data structure that's much better suited to updating, so there's some net gain there, and in the instancer drawing.

But there's no real reason why you can't use an arrayMapper with normal objects.

PS: OK, I get that you want to use a projection, but for your demo there's not much point; it's just a waste of time. You can do the same with the UV placement node, and it's heaps faster.

Voxelizer1
Sep 21, 2010
Post id: 305376

Okay, I'm convinced -- thanks very much, Joojaa. One last question: can this be done with particles at arbitrary locations that aren't in a particle grid, maybe using a UV placement node as the UV scale reference? That's the reason I'd been trying to avoid particles: I want to be able to sample arbitrary points in space.

Joojaa
Sep 22, 2010
Post id: 305386

>> with particles at arbitrary locations that aren't in a particle grid

Yes, of course. You're just limited to two-dimensional sampling.*

*alas you CAN do 3d sampling with a 2d sampler, as you can pack coordinates withing a single double but dangerous but if you know what your doing and know the constraints then why not. However since your doing a projection just calcualte a projection for the particles UV value each frame.