

Why do different render engines generate different z pass?




I've been using Blender to generate depth maps using the Z pass. I notice that the Z passes generated by different render engines differ, which confused me a bit. My impression is that the Z pass generated by the Cycles renderer stores the distance from each pixel to the camera center, while Blender Render stores the orthogonal distance to the camera plane. Do I understand this correctly? If so, is there a way to change this behavior for both render engines?



As an example, the bottom area of the model is a flat surface, to which the camera is pointed perpendicularly. Below are the nodes I use to normalize and visualize the Z-pass data (the Viewer node is used to save the depth map).



Nodes to normalize and view



Z pass with Cycles render:



Z pass with Cycles render



Z pass with Blender Render (the same value everywhere in the bottom part):



Z pass with blender render
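For reference, the two conventions can be converted into one another: under a pinhole camera model, the radial distance (what Cycles appears to store) and the planar depth (what Blender Render appears to store) differ only by the cosine of the angle between each pixel's view ray and the optical axis. A minimal NumPy sketch, where the function name and the intrinsics `focal_px`, `cx`, `cy` are hypothetical placeholders for illustration:

```python
import numpy as np

def radial_to_planar(depth, focal_px, cx, cy):
    """Convert radial distance (distance from each pixel to the camera
    center) to planar depth (orthogonal distance to the camera plane).
    Assumes a pinhole camera with focal length focal_px and principal
    point (cx, cy), all expressed in pixel units."""
    h, w = depth.shape
    u, v = np.meshgrid(np.arange(w) - cx, np.arange(h) - cy)
    # Cosine of the angle between each pixel's view ray and the optical axis.
    cos_theta = focal_px / np.sqrt(u**2 + v**2 + focal_px**2)
    return depth * cos_theta
```

A flat wall perpendicular to the camera has constant planar depth but varying radial distance, which matches the two renders above: applying this conversion to the Cycles Z pass should flatten the bottom area.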





















  • Read this related link: Precision of z coordinate
    – cegaton
    Mar 12 at 17:54










  • Also related: Cycles generates distorted depth
    – cegaton
    Mar 12 at 17:57















cycles rendering blender-render render-passes






asked Mar 12 at 16:12









DingLuo
263











1 Answer







When rendering a Z pass you are essentially creating a depth map from the camera's point of view. The issue is that there is a potentially infinite range of distances to represent, but in a traditional 8-bit image you only have 256 shades of gray to map them to.



The range can go from zero right at the camera (although it is unlikely anything is that close) out to whichever visible object is most distant. There may also be a sky or "background" at a theoretically infinite distance.



There are several possible ways to map these shades of gray to the distance progression, each with its own advantages.



It can be a linear mapping, where detail is distributed evenly across the whole image, but there may also be logarithmic mappings, emphasizing detail in certain parts of the picture.



  • You may want more detail at close range, where the image focus is likely to reside.

  • The scene may require more detail at large distances if you are rendering a landscape or a distant view.

  • You may want to use it for a mist pass, requiring detail at medium range.
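As a concrete illustration of these mapping choices, here is a small NumPy sketch (a hypothetical helper, not part of Blender) that maps depth values into 8-bit grayscale with either a linear or a logarithmic progression; the logarithmic variant spends more of the 256 shades on close range:

```python
import numpy as np

def normalize_depth(depth, mode="linear", near=None, far=None):
    """Map depth values into 0..255 grayscale.
    'linear' spreads detail evenly over [near, far];
    'log' emphasizes detail at close range."""
    near = float(depth.min()) if near is None else near
    far = float(depth.max()) if far is None else far
    d = np.clip(depth, near, far)
    if mode == "linear":
        t = (d - near) / (far - near)
    else:  # logarithmic progression
        t = np.log(d / near) / np.log(far / near)
    return (t * 255).astype(np.uint8)
```

For example, with depths 1, 10, and 100, the linear mapping keeps the two nearer values almost indistinguishable, while the logarithmic one places the middle value halfway up the gray ramp.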

As far as I know, I would expect both Cycles and Blender Render to use the same "true distance to sensor", not a virtual orthographic plane passing through the sensor, but I may be wrong.



If that is indeed the case, or if you require a specific color progression or a custom mapping of values, you may construct your own artificial Z pass.



You can do so by making a basic emission shader with a circular black-to-white gradient mapped to the camera object.



Moving the camera updates the gradient's position. You can scale the gradient to accommodate your desired distance range, and drive it through a Color Ramp node for a non-linear progression.


















        edited Mar 12 at 18:22

























        answered Mar 12 at 16:58









Duarte Farrajota Ramos
34.1k


























