Why Does It Take So Long To Load Figures? ("Solved", with GUIDE)

1235 Comments

  • mindsong Posts: 1,701
    edited August 2021

    TheMysteryIsThePoint said:

    Richard Haseltine said:

    TheMysteryIsThePoint said:

    I know, I know :) I know that the devs are aware of the problem, and surely want to fix it. I just can't stand the practice of invalidating, minimizing or dismissing the legitimate concerns of users as if after 33 years I didn't know a bad architecture when I see one.

    I am not in any way dismissing the problem - it affects me too. I am saying that you are not entitled to call it bad architecture unless you can point to or provide a better one.

    That is a dismissal. Something need not be the worst of all possible manifestations before it can be said that it is not meeting expectations. Think. Literally close your eyes and imagine how the world would be if what you wrote were a universal rule.

    The slow loading can be an issue, and can be discussed, without the need to apportion blame.

    One need not apportion blame when there is only one entity whose fault it could possibly be. Daz Studio does not grow like mold on bread, nor is it delivered from the heavens on stone tablets; everyone knows where it comes from.

    Foreheads, brick walls, etc. - but it's fascinating to see your clarity completely missed here.

    rah rah daz.

    --ms

    Post edited by mindsong on
  • thenoobducky Posts: 68
    edited August 2021

    Joe2018 said:

    The load time for a G8F on my system has by now increased from 30 seconds to 8 minutes. I think DAZ has some very good software engineers. My suggestion is a tool that enables or disables the morphs/poses/etc. that are not needed.

    We see some freeware here in the forum that shows that this is possible. I think it would be no great expense for DAZ to create such a tool. And I think that, like me, many people are also ready to pay for it.

    (Sorry for my English; it is not my native language.)

    This tool exists in Daz. It is called the Content Directory Manager; you can even save different combinations of folders to load by using content sets. What we need is something much more automated and simpler to use. Perhaps Daz should develop a tool that can read the user's mind and selectively load the content the user was thinking about.

    Post edited by thenoobducky on
  • mindsong Posts: 1,701

    Richard Haseltine said:

    ...

    I am saying that you are not entitled to call it bad architecture unless you can point to or provide a better one.
    ...

    where would one even start, when formulating a response to this assertion?

    I'm gonna go get some work done.

    --ms

  • TheMysteryIsThePoint Posts: 2,946
    edited August 2021

    Richard Haseltine said:

    As I said, there is nothing wrong with pointing out the issue or with lobbying for a revision of the system or, if one can be devised, a better system - but it should be done in a way that does not make assumptions about what is possible given the design goals or what was foreseeable in regard to how far it would need to scale.

    With the part highlighted in red, your argument is slowly morphing (no pun intended) into one you might be able to defend. Curiously, it is morphing towards exactly what I have been saying since the beginning.

    I don't know why you keep trying to make this personal. It does not matter at all whether how far it needs to scale was foreseeable or not. That would only matter if my only interest were in trying to assign blame. I really don't care who is to blame; it can only be one entity anyway, and it would be a quixotic exercise in any case.

    It is a fact, independent of me or how I state it, that the environment in which Daz Studio is run has gone outside the conditions in which the architecture performs well. For many, many of its users. It is also a fact, and this is Computer Science, not my opinion, that algorithms with worse than O(log n) complexity will kill you. Eventually. If that's just how it is and truly can't be helped, there has to be something in the architecture to mitigate that. There are always many things that can be done (and to me this is the definition of Engineering: knowing the tradeoffs so that the system performs satisfactorily by all metrics, not stellar at some but dismal at others), but in DS, there does not appear to be such a thing.

    It is objectively a bad architecture.
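    To make the complexity point above concrete, here is a toy cost model - a sketch only; the functions and unit costs are assumptions for illustration, not Daz Studio's actual loader:

    ```python
    import math

    # Toy model: suppose loading each of n installed morph properties costs
    # work proportional to n (e.g. a linear scan over all other properties
    # to resolve formula links), versus an indexed O(log n) lookup per morph.
    def total_cost_linear_scan(n):
        return n * n  # O(n^2) overall

    def total_cost_indexed(n):
        return n * math.log2(max(n, 2))  # O(n log n) overall

    # Growing the library 10x multiplies the quadratic cost by 100,
    # but the indexed cost by only ~13.
    for n in (1_000, 10_000, 100_000):
        print(n, total_cost_linear_scan(n), round(total_cost_indexed(n)))
    ```

    This is the "kill you eventually" shape: the absolute times look fine for a small library, and the gap only becomes painful once the content library grows by an order of magnitude or two.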


    Post edited by TheMysteryIsThePoint on
  • Joe2018 said:

    The load time for a G8F on my system has by now increased from 30 seconds to 8 minutes. I think DAZ has some very good software engineers. My suggestion is a tool that enables or disables the morphs/poses/etc. that are not needed.

    If that is a sudden increase, check the log file, assuming you aren't getting an alert for Duplicate Formulas - those can really slow loading down badly, on top of the usual slowdown for multiple items.

    We see some freeware here in the forum that shows that this is possible. I think it would be no great expense for DAZ to create such a tool. And I think that, like me, many people are also ready to pay for it.

    (Sorry for my English; it is not my native language.)

  • TheMysteryIsThePoint said:

    Richard Haseltine said:

    As I said, there is nothing wrong with pointing out the issue or with lobbying for a revision of the system or, if one can be devised, a better system - but it should be done in a way that does not make assumptions about what is possible given the design goals or what was foreseeable in regard to how far it would need to scale.

    With the part highlighted in red, your argument is slowly morphing (no pun intended) into one you might be able to defend. Curiously, it is morphing towards exactly what I have been saying since the beginning.

    I don't know why you keep trying to make this personal. It does not matter at all whether how far it needs to scale was foreseeable or not. That would only matter if my only interest were in trying to assign blame. I really don't care who is to blame; it can only be one entity anyway, and it would be a quixotic exercise in any case.

    It is a fact, independent of me or how I state it, that the environment in which Daz Studio is run has gone outside the conditions in which the architecture performs well. For many, many of its users. It is also a fact, and this is Computer Science, not my opinion, that algorithms with worse than O(log n) complexity will kill you. Eventually. If that's just how it is and truly can't be helped, there has to be something in the architecture to mitigate that. There are always many things that can be done (and to me this is the definition of Engineering: knowing the tradeoffs so that the system performs satisfactorily by all metrics, not stellar at some but dismal at others), but in DS, there does not appear to be such a thing.

    It is objectively a bad architecture.

    I'm sorry, but your last paragraph flatly contradicts the preceding disavowals. And that is the bit I am arguing with.

  • mindsong said:

    Richard Haseltine said:

    ...

    I am saying that you are not entitled to call it bad architecture unless you can point to or provide a better one.
    ...

    where would one even start, when formulating a response to this assertion?

    I'm gonna go get some work done.

    --ms

    I don't know, MS. I don't know. It's actually quite clever in a certain way. Well, not really, when I can point to every other application I have ever used, much less written.

  • TheMysteryIsThePoint Posts: 2,946
    edited August 2021

    Richard Haseltine said:

    TheMysteryIsThePoint said:

    Richard Haseltine said:

    As I said, there is nothing wrong with pointing out the issue or with lobbying for a revision of the system or, if one can be devised, a better system - but it should be done in a way that does not make assumptions about what is possible given the design goals or what was foreseeable in regard to how far it would need to scale.

    With the part highlighted in red, your argument is slowly morphing (no pun intended) into one you might be able to defend. Curiously, it is morphing towards exactly what I have been saying since the beginning.

    I don't know why you keep trying to make this personal. It does not matter at all whether how far it needs to scale was foreseeable or not. That would only matter if my only interest were in trying to assign blame. I really don't care who is to blame; it can only be one entity anyway, and it would be a quixotic exercise in any case.

    It is a fact, independent of me or how I state it, that the environment in which Daz Studio is run has gone outside the conditions in which the architecture performs well. For many, many of its users. It is also a fact, and this is Computer Science, not my opinion, that algorithms with worse than O(log n) complexity will kill you. Eventually. If that's just how it is and truly can't be helped, there has to be something in the architecture to mitigate that. There are always many things that can be done (and to me this is the definition of Engineering: knowing the tradeoffs so that the system performs satisfactorily by all metrics, not stellar at some but dismal at others), but in DS, there does not appear to be such a thing.

    It is objectively a bad architecture.

    I'm sorry, but your last paragraph flatly contradicts the preceding disavowals. And that is the bit I am arguing with.

    You've lost me. Something is wrong when I not only disagree with you but can no longer even understand your position. But that is what must happen during the defense of the indefensible.

    Edit: Oh, I think I get it... you are interpreting it as a personal attack? A serious enquiry: do you not know what "objective" means?


    Post edited by TheMysteryIsThePoint on
  • TheMysteryIsThePoint said:

    Richard Haseltine said:

    TheMysteryIsThePoint said:

    Richard Haseltine said:

    As I said, there is nothing wrong with pointing out the issue or with lobbying for a revision of the system or, if one can be devised, a better system - but it should be done in a way that does not make assumptions about what is possible given the design goals or what was foreseeable in regard to how far it would need to scale.

    With the part highlighted in red, your argument is slowly morphing (no pun intended) into one you might be able to defend. Curiously, it is morphing towards exactly what I have been saying since the beginning.

    I don't know why you keep trying to make this personal. It does not matter at all whether how far it needs to scale was foreseeable or not. That would only matter if my only interest were in trying to assign blame. I really don't care who is to blame; it can only be one entity anyway, and it would be a quixotic exercise in any case.

    It is a fact, independent of me or how I state it, that the environment in which Daz Studio is run has gone outside the conditions in which the architecture performs well. For many, many of its users. It is also a fact, and this is Computer Science, not my opinion, that algorithms with worse than O(log n) complexity will kill you. Eventually. If that's just how it is and truly can't be helped, there has to be something in the architecture to mitigate that. There are always many things that can be done (and to me this is the definition of Engineering: knowing the tradeoffs so that the system performs satisfactorily by all metrics, not stellar at some but dismal at others), but in DS, there does not appear to be such a thing.

    It is objectively a bad architecture.

    I'm sorry, but your last paragraph flatly contradicts the preceding disavowals. And that is the bit I am arguing with.

    You've lost me. Something is wrong when I not only disagree with you but can no longer even understand your position. But that is what must happen during the defense of the indefensible.

    Edit: Oh, I think I get it... you are interpreting it as a personal attack? A serious enquiry: do you not know what "objective" means?

    Yes, "objective" is here an assertion that your opinion is a fact. But if you say it is bad architecture, that implies that the designer is at fault, so objectively it is an attack. As I have pointed out before, it may well be the best solution to the problem (and indeed it does work, and works well for those with fewer properties to load, and works with an aggravating delay for the rest of us). "Bad" is a contrast to "good" - if you wish to claim the architecture is bad, then you must offer an alternative (of your own design or, more realistically, by pointing to another application which does an equivalent job without slowing under a similar load).

  • Saxa -- SD Posts: 872
    edited August 2021

    Would like very much to see a recommendation for another software from a design POV. Or aspects of it. And then a statement of its size and funding modus operandi, to be realistic.

    Remember reading for years how many CGI people complained that Maya was bloatware and too far gone and too big to rebuild... properly. Arguable, it would seem. Don't know where that is today.

    The free Blender software model is liked by quite a few.
    Personally I do not like using its more mathematical UI/process, or that's how I would call it.
    DAZ is far more intuitive for me.

    Without access to financials, am still stuck arguing DAZ remains a smaller "fish" for now.
    And their money-making strategy to pay all their staff, including engineers, involves spending a good portion of time enabling features that pay for DAZ and all of its amenities.
    "Non-profit" Blender has for the moment hit mainstream, thanks in huge part to Ton's relentless determination. Means way more $.
    Same sorta could be said for Thomas's bridge. "Relentless". Though I think there's no cash there yet?

    But a quicker, easier morph management system with DS5 would be seriously appreciated. And would mean more money again for DAZ. How much, though, depends on the eye of the beholder. And is arguable.
    And there probably are enough people accepting morph mgmt as is, because that's the way it is, just like they accept Win10 updates, which may cause issues.

    One option would be an easier morph activator system.
    Though as someone who uses DAZ's custom ERC system a lot, that quick toggle would require some kind of quick custom sorting via grouping DS windows, in my mind, for quicker use.

    Semi-closed systems like DAZ are challenging to keep relevant.
    Say semi-closed because plug-in development is still possible.
    $ and new features that bring in $ are in constant demand.
    So morph toggles seem more a grumble issue right now (as opposed to a big, big issue). Triage, if you will.
    But maybe?

    For now the choice by DAZ is to add newish Mac operability back in, while doing Qt upgrades and getting ready for a new Gen/more features for PC users. Or that's what I read between the lines.
    Limited resources. What to do.

    Personally wish they'd make animation stronger.

    Post edited by Saxa -- SD on
  • TheMysteryIsThePoint Posts: 2,946
    edited August 2021

    Richard Haseltine said:

    TheMysteryIsThePoint said:

    Richard Haseltine said:

    TheMysteryIsThePoint said:

    Richard Haseltine said:

    As I said, there is nothing wrong with pointing out the issue or with lobbying for a revision of the system or, if one can be devised, a better system - but it should be done in a way that does not make assumptions about what is possible given the design goals or what was foreseeable in regard to how far it would need to scale.

    With the part highlighted in red, your argument is slowly morphing (no pun intended) into one you might be able to defend. Curiously, it is morphing towards exactly what I have been saying since the beginning.

    I don't know why you keep trying to make this personal. It does not matter at all whether how far it needs to scale was foreseeable or not. That would only matter if my only interest were in trying to assign blame. I really don't care who is to blame; it can only be one entity anyway, and it would be a quixotic exercise in any case.

    It is a fact, independent of me or how I state it, that the environment in which Daz Studio is run has gone outside the conditions in which the architecture performs well. For many, many of its users. It is also a fact, and this is Computer Science, not my opinion, that algorithms with worse than O(log n) complexity will kill you. Eventually. If that's just how it is and truly can't be helped, there has to be something in the architecture to mitigate that. There are always many things that can be done (and to me this is the definition of Engineering: knowing the tradeoffs so that the system performs satisfactorily by all metrics, not stellar at some but dismal at others), but in DS, there does not appear to be such a thing.

    It is objectively a bad architecture.

    I'm sorry, but your last paragraph flatly contradicts the preceding disavowals. And that is the bit I am arguing with.

    You've lost me. Something is wrong when I not only disagree with you but can no longer even understand your position. But that is what must happen during the defense of the indefensible.

    Edit: Oh, I think I get it... you are interpreting it as a personal attack? A serious enquiry: do you not know what "objective" means?

    Yes, "objective" is here an assertion that your opinion is a fact.

    You are dismissing all the user complaints from people that are demonstrably "not me" as "my opinion"?

    But if you say it is bad architecture, that implies that the designer is at fault, so objectively it is an attack.

    One simply cannot be an engineer if every objective critique of the performance characteristics of one's work is considered a personal attack. That is how better engineers are made. The practice is codified in "code reviews", where an engineer's peers review his or her work and provide feedback. This is how professional teams produce the best possible code, and how junior engineers turn into senior engineers.

    As I have pointed out before, it may well be the best solution to the problem (and indeed it does work, and works well for those with fewer properties to load, and works with an aggravating delay for the rest of us).

    Richard, if an aggravating delay is "the best solution", it is already the fault of the architecture.

    "Bad" is a contrast to "good" - if you wish to claim the architecture is bad, then you must offer an alternative (of your own design or, more realistically, by pointing to another application which does an equivalent job without slowing under a similar load).

    Believe me, if DS were open source, I already would have.


    Post edited by TheMysteryIsThePoint on
  • Joe2018 Posts: 254

    Richard Haseltine said:

    Joe2018 said:

    The load time for a G8F on my system has by now increased from 30 seconds to 8 minutes. I think DAZ has some very good software engineers. My suggestion is a tool that enables or disables the morphs/poses/etc. that are not needed.

    If that is a sudden increase, check the log file, assuming you aren't getting an alert for Duplicate Formulas - those can really slow loading down badly, on top of the usual slowdown for multiple items.

    We see some freeware here in the forum that shows that this is possible. I think it would be no great expense for DAZ to create such a tool. And I think that, like me, many people are also ready to pay for it.

    (Sorry for my English; it is not my native language.)

    I own too many G8F and G8M characters - I think that is the reason why it is slow.

    At the moment I use two PCs: one with all morphs/characters/poses/etc., and one with only the content I need "for my current project". That one is much faster.


  • Joe2018 Posts: 254

    thenoobducky said:

    Joe2018 said:

    The load time for a G8F on my system has by now increased from 30 seconds to 8 minutes. I think DAZ has some very good software engineers. My suggestion is a tool that enables or disables the morphs/poses/etc. that are not needed.

    We see some freeware here in the forum that shows that this is possible. I think it would be no great expense for DAZ to create such a tool. And I think that, like me, many people are also ready to pay for it.

    (Sorry for my English; it is not my native language.)

    This tool exists in Daz. It is called the Content Directory Manager; you can even save different combinations of folders to load by using content sets. What we need is something much more automated and simpler to use. Perhaps Daz should develop a tool that can read the user's mind and selectively load the content the user was thinking about.

    I know that and use it. What I would like to have is a simpler tool with a better GUI, and something like load and save presets, etc.

  • thenoobducky Posts: 68
    edited August 2021

    Joe2018 said:

    thenoobducky said:

    Joe2018 said:

    The load time for a G8F on my system has by now increased from 30 seconds to 8 minutes. I think DAZ has some very good software engineers. My suggestion is a tool that enables or disables the morphs/poses/etc. that are not needed.

    We see some freeware here in the forum that shows that this is possible. I think it would be no great expense for DAZ to create such a tool. And I think that, like me, many people are also ready to pay for it.

    (Sorry for my English; it is not my native language.)

    This tool exists in Daz. It is called the Content Directory Manager; you can even save different combinations of folders to load by using content sets. What we need is something much more automated and simpler to use. Perhaps Daz should develop a tool that can read the user's mind and selectively load the content the user was thinking about.

    I know that and use it. What I would like to have is a simpler tool with a better GUI, and something like load and save presets, etc.

    Save/load presets are not sufficient; the tool also needs to be able to combine several presets together.

    Here are some criteria I think any solution must meet to be suitable for use by everyone. An individual bespoke solution does not have to meet them if the person using it is OK with the trade-offs.

    1. Does not break the DIM install/uninstall/update process, i.e. no moving, deleting, or copying files without DIM being aware of it.

    2. Works with manually installed content, because not all content is DIM-compatible, and not everyone uses DIM.

    3. Space-efficient, so it doesn't take up much hard-drive space: no multiple copies of the same file.

    4. Loads the scene with 1-2 clicks.

    5. Much faster than the existing system.

    6. Does not break existing scenes.

    7. Does not cause any lag when the user wants to change a character's pose or shape; otherwise the UI experience suffers.

    8. Easy to add new selections. A very common use case - opening a scene and adding a new character with different morphs, clothing, or hair, or changing a character's shape - should be as easy as the current process.

    9. Allows quick and easy switching between multiple scenes.

    10. Reproducible. Use case: if I move my content library to another location, it should be quick and easy to make the scene load again.
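    As a rough illustration of the preset-combining point above, here is a minimal sketch, assuming a preset is simply a JSON array of content-directory paths - a hypothetical format for illustration, not how Daz's content sets are actually stored:

    ```python
    import json
    from pathlib import Path

    def load_preset(path):
        # A "preset" here is assumed to be a JSON array of directory paths.
        return set(json.loads(Path(path).read_text()))

    def combine_presets(*preset_paths):
        # Combining is a set union, so a directory shared by several presets
        # is listed only once - no duplicate copies of anything on disk.
        combined = set()
        for p in preset_paths:
            combined |= load_preset(p)
        return sorted(combined)
    ```

    The union-of-sets design choice matters for criteria 1 and 3: the tool would only change which directories the application maps in, never move or duplicate the files themselves.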


    Post edited by thenoobducky on
  • Joe2018 said:

    Richard Haseltine said:

    Joe2018 said:

    The load time for a G8F on my system has by now increased from 30 seconds to 8 minutes. I think DAZ has some very good software engineers. My suggestion is a tool that enables or disables the morphs/poses/etc. that are not needed.

    If that is a sudden increase, check the log file, assuming you aren't getting an alert for Duplicate Formulas - those can really slow loading down badly, on top of the usual slowdown for multiple items.

    We see some freeware here in the forum that shows that this is possible. I think it would be no great expense for DAZ to create such a tool. And I think that, like me, many people are also ready to pay for it.

    (Sorry for my English; it is not my native language.)

    I own too many G8F and G8M characters - I think that is the reason why it is slow.

    At the moment I use two PCs: one with all morphs/characters/poses/etc., and one with only the content I need "for my current project". That one is much faster.

    Yes, if it has been a slow increase in load speed, that is just the number of morphs - I read your comment as a sudden jump by a factor of about three on the fully-loaded version, which would tend to indicate an actual issue with one or more recent additions (probably conflicting with an existing morph rather than itself being at fault) rather than simply the cumulative slowing. Eight minutes does sound excessive, unless there is some other compounding factor (such as security software that parses each file loaded).
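    A quick sketch of that log check - note the matched phrase "duplicate formula" and the path in the comment are assumptions; use Help > Troubleshooting > View Log File in Daz Studio to find the real log:

    ```python
    def count_duplicate_formula_warnings(log_text):
        # Case-insensitive count of log lines mentioning duplicate formulas.
        return sum(1 for line in log_text.splitlines()
                   if "duplicate formula" in line.lower())

    # Example of reading the log on Windows (path is an assumption):
    # from pathlib import Path
    # log = (Path.home() / "AppData/Roaming/DAZ 3D/Studio4/log.txt").read_text()
    # print(count_duplicate_formula_warnings(log))
    ```

    A non-zero count on a figure that suddenly got much slower to load would point at a recently installed morph conflicting with an existing one, per the diagnosis above.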

  • TheMysteryIsThePoint said:

    Richard Haseltine said:

    TheMysteryIsThePoint said:

    Richard Haseltine said:

    TheMysteryIsThePoint said:

    Richard Haseltine said:

    As I said, there is nothing wrong with pointing out the issue or with lobbying for a revision of the system or, if one can be devised, a better system - but it should be done in a way that does not make assumptions about what is possible given the design goals or what was foreseeable in regard to how far it would need to scale.

    With the part highlighted in red, your argument is slowly morphing (no pun intended) into one you might be able to defend. Curiously, it is morphing towards exactly what I have been saying since the beginning.

    I don't know why you keep trying to make this personal. It does not matter at all whether how far it needs to scale was foreseeable or not. That would only matter if my only interest were in trying to assign blame. I really don't care who is to blame; it can only be one entity anyway, and it would be a quixotic exercise in any case.

    It is a fact, independent of me or how I state it, that the environment in which Daz Studio is run has gone outside the conditions in which the architecture performs well. For many, many of its users. It is also a fact, and this is Computer Science, not my opinion, that algorithms with worse than O(log n) complexity will kill you. Eventually. If that's just how it is and truly can't be helped, there has to be something in the architecture to mitigate that. There are always many things that can be done (and to me this is the definition of Engineering: knowing the tradeoffs so that the system performs satisfactorily by all metrics, not stellar at some but dismal at others), but in DS, there does not appear to be such a thing.

    It is objectively a bad architecture.

    I'm sorry, but your last paragraph flatly contradicts the preceding disavowals. And that is the bit I am arguing with.

    You've lost me. Something is wrong when I not only disagree with you but can no longer even understand your position. But that is what must happen during the defense of the indefensible.

    Edit: Oh, I think I get it... you are interpreting it as a personal attack? A serious enquiry: do you not know what "objective" means?

    Yes, "objective" is here an assertion that your opinion is a fact.

    You are dismissing all the user complaints from people that are demonstrably "not me" as "my opinion"?

    I'm not even dismissing your complaint, still less those of others.

    But if you say it is bad architecture, that implies that the designer is at fault, so objectively it is an attack.

    One simply cannot be an engineer if every objective critique of the performance characteristics of one's work is considered a personal attack. That is how better engineers are made. The practice is codified in "code reviews", where an engineer's peers review his or her work and provide feedback. This is how professional teams produce the best possible code, and how junior engineers turn into senior engineers.

    It should be possible to give an analysis of the situation and consider possible alternative approaches or mitigating strategies without indulging in attacks. That has been my sole point in arguing with you.

    As I have pointed out before, it may well be the best solution to the problem (and indeed it does work, and works well for those with fewer properties to load, and works with an aggravating delay for the rest of us).

    Richard, if an aggravating delay is "the best solution", it is already the fault of the architecture.

    Perhaps you mean something different by architecture, rather than the way the system meets the design goals?

    "Bad" is a contrast to "good" - if you wish to claim the architecture is bad, then you must offer an alternative (of your own design or, more realistically, by pointing to another application which does an equivalent job without slowing under a similar load).

    Believe me, if DS were open source, I already would have.

    Well, if you actually know how to meet the design objectives without the lag on large data sets, then you should be able to describe the process without needing access to the code; if you don't have an actual solution, then it is quite possible that you would do no better than the Daz developers if you did have access to the code.

  • mindsong Posts: 1,701

    Richard Haseltine said:

    ...

    Well, if you actually know how to meet the design objectives without the lag on large data sets, then you should be able to describe the process without needing access to the code; if you don't have an actual solution, then it is quite possible that you would do no better than the Daz developers if you did have access to the code.

    wow. just ... wow.

    --ms

  • Richard Haseltine said:

    TheMysteryIsThePoint said:

    Richard Haseltine said:

    TheMysteryIsThePoint said:

    Richard Haseltine said:

    TheMysteryIsThePoint said:

    Richard Haseltine said:

    As I said, there is nothing wrong with pointing out the issue or with lobbying for a revision of the system or, if one can be devised, a better system - but it should be done in a way that does not make assumptions about what is possible given the design goals or what was foreseeable in regard to how far it would need to scale.

    With the part highlighted in red, your argument is slowly morphing (no pun intended) into one you might be able to defend. Curiously, it is morphing towards exactly what I have been saying since the beginning.

    I don't know why you keep trying to make this personal. It does not matter at all whether how far it needs to scale was foreseeable or not. That would only matter if my only interest were in trying to assign blame. I really don't care who is to blame; it can only be one entity anyway, and it would be a quixotic exercise in any case.

    It is a fact, independent of me or how I state it, that the environment in which Daz Studio is run has gone outside the conditions in which the architecture performs well. For many, many of its users. It is also a fact, and this is Computer Science, not my opinion, that algorithms with worse than O(log n) complexity will kill you. Eventually. If that's just how it is and truly can't be helped, there has to be something in the architecture to mitigate that. There are always many things that can be done (and to me this is the definition of Engineering: knowing the tradeoffs so that the system performs satisfactorily by all metrics, not stellar at some but dismal at others), but in DS, there does not appear to be such a thing.

    It is objectively a bad architecture.

    I'm sorry, but your last paragraph flatly contradicts the preceding disavowals. And that is the bit I am arguing with.

    You've lost me. Something is wrong when I not only disagree with you, but can no longer even understand your position. But that is what must happen during the defense of the indefensible.

    Edit: Oh, I think I get it... you are interpreting it as a personal attack? A serious enquiry: Do you not know what "objective" means?

    Yes, "objective" is here an assertion that your opinion is a fact.

    You are dismissing all the user complaints from people that are demonstrably "not me" as "my opinion"?

    I'm not even dismissing your complaint, still less those of others.

    I think "Yes, objective is her an assertion that your opinion is a fact." after a completely objective analysis (the opposite of "opinion") was given is as close to "dissmissive" as one can get.

    But if you say it is bad architecture that implies that the designer is at fault, so objectively it is an attack.

    One can simply not be an engineer if every objective critique of the performance characteristics of one's work is considered a personal attack. That is how better engineers are made. The practice is codified in "code reviews" where an engineer's peers review his/her work and provide feedback. This is how professional teams produce the best possible code and junior engineers turn into senior engineers.

    It should be possible to give an analysis of the situation and consider possible alternative approaches or mitigating strategies without indulging in attacks. That has been my sole point in arguing with you.

    Instead of simply repeating that I've "indulged" in attacks, perhaps you could quote me referring to anything other than the empirically observed qualities of the architecture that everyone else has complained about so extensively as well. I honestly have no idea what you are talking about.

    As I have pointed out before, it may well be the best solution to the problem (and indeed it does work, and works well for those with fewer properties to load, and works with an aggravating delay for the rest of us).

    Richard, if an aggravating delay is "the best solution", it is already the fault of the architecture.

    Perhaps you mean something different by architecture, rather than the way the system meets the design goals?

    I am simply not going to defend my understanding of what Architecture is, especially when, with your own definition of it, you've validated my point yet continue arguing as if you had instead successfully refuted it.

    "Bad" is a contrast to "good" - if you wish to claim the architecture is bad then you must offer an alternative (of your own design or, more reaslsitically, pointing to another application which does an equivalent job without slowing under a similar load).

    Believe me, if DS were open source, I already would have.

    Well, if you actually know how to meet the design objectives without the lag on large data sets then you should be able to describe the process without needing access to the code; if you don't have an actual solution then it is quite possible that you would do no better than the Daz developers if you did have access to the code.

    This is a completely uninformed response. Richard, there is always a tradeoff, at the simplest level between time and space. I don't need to see the code, nor know the solution, to know that one exists. It is sometimes useful to think in the abstract.

    But the thing that you've said that makes the least sense is actually this: You yourself have said that this will eventually be fixed, and I agree. But how do you think they are going to do that, if worse than O(log n) complexity is, as you say, just the somehow inevitable nature of the problem? If they cannot change the algorithms dictated by the current architecture, they have to change that architecture to dictate other ones, i.e. the current architecture is bad, by your argument.

    Not only do I not understand your point nor your objection, I am beginning to suspect that you don't either.

     

  • TheMysteryIsThePoint said:

    Richard Haseltine said:

    TheMysteryIsThePoint said:

    Richard Haseltine said:

    TheMysteryIsThePoint said:

    Richard Haseltine said:

    TheMysteryIsThePoint said:

    Richard Haseltine said:

    As I said, there is nothing wrong with pointing out the issue or with lobbying for a revision of the system or, if one can be devised, a better system - but it should be done in a way that does not make assumptions about what is possible given the design goals or about what was foreseeable in regards to how far it would need to scale.

    With the part highlighted in red, your argument is slowly morphing (no pun intended) into one you might be able to defend. Curiously, it's morphing towards exactly what I have been saying since the beginning.

    I don't know why you keep trying to make this personal. It does not matter at all if how far it needs to scale was foreseeable or not foreseeable. That would only matter if my only interest were in trying to assign blame. I really don't care who is to blame, it can only be one entity anyway, and it would be a Quixotian exercise by any means.

    It is a fact, independent of me or how I state it, that the environment in which Daz Studio is run has gone outside the conditions in which the architecture performs well. For many, many of its users. It is also a fact, and this is Computer Science, not my opinion, that algorithms with worse than O(log n) complexity will kill you. Eventually. If that's just how it is and truly can't be helped, there has to be something in the architecture to mitigate that. There are always many things that can be done (and to me this is the definition of Engineering: knowing the tradeoffs so that the system performs satisfactorily by all metrics, not stellar at some but dismal at others), but in DS, there does not appear to be such a thing.

    It is objectively a bad architecture.

    I'm sorry, but your last paragraph flatly contradicts the preceding disavowals. And that is the bit I am arguing with.

    You've lost me. Something is wrong when I not only disagree with you, but can no longer even understand your position. But that is what must happen during the defense of the indefensible.

    Edit: Oh, I think I get it... you are interpreting it as a personal attack? A serious enquiry: Do you not know what "objective" means?

    Yes, "objective" is here an assertion that your opinion is a fact.

    You are dismissing all the user complaints from people that are demonstrably "not me" as "my opinion"?

    I'm not even dismissing your complaint, still less those of others.

    I think "Yes, objective is her an assertion that your opinion is a fact." after a completely objective analysis (the opposite of "opinion") was given is as close to "dissmissive" as one can get.

    I will repeat this one more time: I am not denying that loading can get painfully slow, I am disputing your assumption - and it is an assumption - that this means bad architecture, unless you are making the trivial point that since the process can get slow then it is a process prone to slowness, without any implication that there might have been much better ways to achieve the same end. That doesn't, of course, mean that there definitely aren't better ways to achieve the same end - but neither of us is in a position to know, and therefore neither of us is in a position to point fingers.

    But if you say it is bad architecture that implies that the designer is at fault, so objectively it is an attack.

    One can simply not be an engineer if every objective critique of the performance characteristics of one's work is considered a personal attack. That is how better engineers are made. The practice is codified in "code reviews" where an engineer's peers review his/her work and provide feedback. This is how professional teams produce the best possible code and junior engineers turn into senior engineers.

    It should be possible to give an analysis of the situation and consider possible alternative approaches or mitigating strategies without indulging in attacks. That has been my sole point in arguing with you.

    Instead of simply repeating that I've "indulged" in attacks, perhaps you could quote me referring to anything other than the empirically observed qualities of the architecture that everyone else has complained about so extensively as well. I honestly have no idea what you are talking about.

    As I have pointed out before, it may well be the best solution to the problem (and indeed it does work, and works well for those with fewer properties to load, and works with an aggravating delay for the rest of us).

    Richard, if an aggravating delay is "the best solution", it is already the fault of the architecture.

    Perhaps you mean something different by architecture, rather than the way the system meets the design goals?

    I am simply not going to defend my understanding of what Architecture is, especially when, with your own definition of it, you've validated my point yet continue arguing as if you had instead successfully refuted it.

    "Bad" is a contrast to "good" - if you wish to claim the architecture is bad then you must offer an alternative (of your own design or, more reaslsitically, pointing to another application which does an equivalent job without slowing under a similar load).

    Believe me, if DS were open source, I already would have.

    Well, if you actually know how to meet the design objectives without the lag on large data sets then you should be able to describe the process without needing access to the code; if you don't have an actual solution then it is quite possible that you would do no better than the Daz developers if you did have access to the code.

    This is a completely uninformed response. Richard, there is always a tradeoff, at the simplest level between time and space. I don't need to see the code, nor know the solution, to know that one exists. It is sometimes useful to think in the abstract.

    You do, however, need to know what the options are to say that a "bad" choice was made. Once again, it may be possible to improve the scaling of load time with property count - I don't know that it isn't, you don't know that it is (it seems likely that some variation might be possible, but whether it would be significant is another matter that doesn't have an answer from general principles).

    But the thing that you've said that makes the least sense is actually this: You yourself have said that this will eventually be fixed, and I agree. But how do you think they are going to do that, if worse than O(log n) complexity is, as you say, just the somehow inevitable nature of the problem? If they cannot change the algorithms dictated by the current architecture, they have to change that architecture to dictate other ones, i.e. the current architecture is bad, by your argument.

    I haven't said it will be fixed, I have said that Daz is (obviously) aware of the situation and I am sure they are looking into it. How they will address it I don't know - whether they will be able to pull a rabbit out of a hat and make something that gives the same result more quickly and with greater scalability, or introduce some kind of management system (like the old PowerLoader for the fourth generation figures and the scripts some end users have written), or take a different approach entirely (which would quite possibly require a new figure series), I have no idea; nor can I guess how far, if at all, they will be able to improve actual performance without sacrificing features.

    Not only do I not understand your point nor your objection, I am beginning to suspect that you don't either.

     

  • mindsong said:

    Richard Haseltine said:

    ...

    Well, if you actually know how to meet the design objectives without the lag on large data sets then you should be able to describe the process without needing access to the code; if you don't have an actual solution then it is quite possible that you would do no better than the Daz developers if you did have access to the code.

    wow. just ... wow.

    --ms

    Please do note that that was in response to an assertion that the other poster would be able to write a better system - I don't hold with the "if you can't do better you can't criticise" school, and have previously been careful to say that a case that the current system is sub-par could be made by pointing to another system that does a similar job with better scalability.

  • Richard Haseltine said:

    I will repeat this one more time: I am not denying that loading can get painfully slow, I am disputing your assumption - and it is an assumption - that this means bad architecture, unless you are making the trivial point that since the process can get slow then it is a process prome to slowness without any implication that there might have been much better ways to achieve the same end. That doesn't, of course, mean that there definitely aren't better ways to achieve the same end - but neither of us is in a position to know, and therefore neither of us is in a postion to point fingers.

    You are wrong about that, as well. Here is why: On one extreme, we have what the system appears to be doing: at least parsing lots of morphs that do not deform any mesh. On the other extreme, a node's duf file contains references to only the morphs used, and those files in turn have all the vertex offsets necessary to know how to deform the mesh. DS's source code may be locked away, but anyone can read the DSON spec as well as see it in action in an actual .duf file for themselves and protect themselves against false statements like the one you made above.
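
    That claim is easy to check for yourself: a .duf file is just JSON (often gzip-compressed on disk), so a few lines of Python can list which modifiers a saved scene actually references. This is only a sketch - the key names follow the public DSON spec, and the file path in the usage comment is a placeholder, not a real product file:

```python
import gzip
import json

def load_dson(path):
    """DSON files (.duf/.dsf) are JSON, frequently gzip-compressed."""
    try:
        with gzip.open(path, "rt", encoding="utf-8") as f:
            return json.load(f)
    except OSError:
        # Not gzipped - fall back to reading it as plain JSON.
        with open(path, "r", encoding="utf-8") as f:
            return json.load(f)

def referenced_modifiers(doc):
    """Return the modifier URLs a saved scene actually references."""
    scene = doc.get("scene", {})
    return [m.get("url", "?") for m in scene.get("modifiers", [])]

# Hypothetical usage - substitute a real scene file:
# doc = load_dson("scene.duf")
# for url in referenced_modifiers(doc):
#     print(url)
```

    The point being: the saved scene names only the morphs in use, so the information needed for a "load only what this scene references" path is already in the data format.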

    One can simply not be an engineer if every objective critique of the performance characteristics of one's work is considered a personal attack. That is how better engineers are made. The practice is codified in "code reviews" where an engineer's peers review his/her work and provide feedback. This is how professional teams produce the best possible code and junior engineers turn into senior engineers.

    It should be possible to give an analysis of the situation and consider possible alternative approaches or mitigating strategies without indulging in attacks. That has been my sole point in arguing with you.

    Instead of simply repeating that I've "indulged" in attacks, perhaps you could quote me referring to anything other than the empirically observed qualities of the architecture that everyone else has complained about so extensively as well. I honestly have no idea what you are talking about.

    As I have pointed out before, it may well be the best solution to the problem 9and indeed it does work, and works well for those with fewer properties to load, and works with an aggravating delay for the rest of us).

    Richard, if an aggravating delay is "the best solution", it is already the fault of the architecture.

    Perhaps you mean something different by architecture, rather than the way the system meets the design goals?

    I am simply not going to defend my understanding of what Architecture is. This especially when, with your own definition of it, you've validated my point but yet continue arguing as if you had instead successfully refuted it.

    "Bad" is a contrast to "good" - if you wish to claim the architecture is bad then you must offer an alternative (of your own design or, more reaslsitically, pointing to another application which does an equivalent job without slowing under a similar load).

    Believe me, if DS were open source, I already would have.

    Well, if you actually know how to meet the design objectives without the lag on large data sets then you should be able to describe the process without needing access to the code; if you don't have an actual solution then it is quite possible that you would do no better than the Daz developers if you did have access to the code.

    This is a completely uninformed response. Richard, there is always a tradeoff, at the simplest level between time and space. I don't need to see the code, nor know the solution, to know that one exists. It is sometimes useful to think in the abstract.

    You do, however, need to know what the options are to say that a "bad" choice was made. Once again, it may be possible to improve the scaling of load time with property count - I don't know that it isn't, you don't know that it is (it seems likely that some variation might be possible, but whether it would be significant is another matter that doesn't have an answer from general principles).

    I don't think that you have developed enough complex computing systems in your lifetime to have developed much of an intuition about these things. Richard, of course it is possible. You are doing exactly what you say cannot be done: creating your "answer from general principles" and it is causing you to write things that any experienced software engineer would find nonsensical. Think about what I just wrote.

    But the thing that you've said that makes the least sense is actually this: You yourself have said that this will eventually be fixed, and I agree. But how do you think they are going to do that, if worse than O(log n) complexity is, as you say, just the somehow inevitable nature of the problem? If they cannot change the algorithms dictated by the current architecture, they have to change that architecture to dictate other ones, i.e. the current architecture is bad, by your argument.

    I haven't said it will be fixed, I have said that Daz is (obviously) aware of the situation and I am sure they are looking into it. How they will address it I don't know - whether they will be able to pull a rabbit out of a hat and make something that gives the same result more quickly and with greater scalability, or introduce some kind of management system (like the old PowerLoader for the fourth generation figures and the scripts some end users have written), or take a different approach entirely (which would quite possibly require a new figure series), I have no idea; nor can I guess how far, if at all, they will be able to improve actual performance without sacrificing features.

    So, you are saying that they're going to improve the architecture. That's what I've been saying as well; thank you for the unlikely assist.

     

  • thenoobducky Posts: 68
    edited August 2021

    TheMysteryIsThePoint said:

    Richard Haseltine said:

    I will repeat this one more time: I am not denying that loading can get painfully slow, I am disputing your assumption - and it is an assumption - that this means bad architecture, unless you are making the trivial point that since the process can get slow then it is a process prone to slowness, without any implication that there might have been much better ways to achieve the same end. That doesn't, of course, mean that there definitely aren't better ways to achieve the same end - but neither of us is in a position to know, and therefore neither of us is in a position to point fingers.

    You are wrong about that, as well. Here is why: On one extreme, we have what the system appears to be doing: at least parsing lots of morphs that do not deform any mesh. On the other extreme, a node's duf file contains references to only the morphs used, and those files in turn have all the vertex offsets necessary to know how to deform the mesh. DS's source code may be locked away, but anyone can read the DSON spec as well as see it in action in an actual .duf file for themselves and protect themselves against false statements like the one you made above.

    Except that is not the root cause of the problem. The biggest slowdown seems to be caused by Daz needing to create every slider in the shape and posing panes along with their associated ERC formulas. This observation comes from watching filesystem activity while Daz loads a figure and looking at what it actually reads.

    The sliders are a desired feature because they are beginner-friendly and many people want them. This is inherently an O(n) process: Daz needs to go through every file to get the sliders. Maybe Daz can optimise the process to be more efficient, or make it multi-threaded, but multi-threading would only buy you a 2-5x performance improvement, considering most CPUs only have 4-8 cores. People have suggested parsing the files only to create dummy sliders and formulas, and creating the actual formulas as needed, but it is not clear to me that deferring the actual formulas would be that much faster, since each file still needs to be parsed completely.

    Seems to me that the only solution is to go outside the box and change the requirements. What Daz really needs is a way for the user to specify that it should load all morphs from a list of folders and all morphs from a list of products, and to allow the user to selectively load morphs from the rest later.
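
    As a rough sketch of that idea (hypothetical folder names and helper functions, not Daz's actual loader), the selection step might look like this - only files under opt-in folders get parsed at all, and the shortlisted parsing can be farmed out to a thread pool for the small constant-factor win mentioned above:

```python
import os
from concurrent.futures import ThreadPoolExecutor

def find_morph_files(root):
    """Collect candidate morph files (.dsf) under a content folder."""
    hits = []
    for dirpath, _dirnames, filenames in os.walk(root):
        hits.extend(os.path.join(dirpath, name)
                    for name in filenames if name.lower().endswith(".dsf"))
    return hits

def select_morphs(all_roots, allowed_roots):
    """Keep only morph files under folders the user opted in to."""
    allowed = tuple(os.path.abspath(r) for r in allowed_roots)
    candidates = []
    for root in all_roots:
        candidates.extend(find_morph_files(root))
    return [p for p in candidates
            if os.path.abspath(p).startswith(allowed)]

def parse_all(paths, parse_one):
    """Parse the shortlisted files concurrently (2-5x at best on 4-8 cores)."""
    with ThreadPoolExecutor() as pool:
        return list(pool.map(parse_one, paths))
```

    The win here is not a faster parser; it is that n itself shrinks to the morphs the user actually asked for.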

    Post edited by thenoobducky on
  • mindsong Posts: 1,701

    Richard Haseltine said:

    mindsong said:

    Richard Haseltine said:

    ...

    Well, if you actually know how to meet the design objectives without the lag on large data sets then you should be able to describe the process without needing access to the code; if you don't have an actual solution then it is quite possible that you would do no better than the Daz developers if you did have access to the code.

    wow. just ... wow.

    --ms

    Please do note that that was in response to an assertion that the other poster would be able to write a better system - I don't hold with the "if you can't do better you can't criticise" school, and have previously been careful to say that a case that the current system is sub-par could be made by pointing to another system that does a similar job with better scalability.

    While it appears that the problem being discussed is sluggish figure loading, based on DS behaviors that are simple to describe, predict, and repeat...

    I propose that a parallel issue being discussed here is the audacity of savvy programmer/users that would, without seeing the code (!), speculate as to why this is occurring, and even more irritating, folks that would propose reasons as to why it's still happening after all of these years and suggest (!) possible solutions.

    How dare they!

    This is hardly a new behavior in here, and in fact it seems to be an open wound that keeps getting (usually inadvertently) poked by rational and frustrated programmer/users who seem to know this business pretty well.

    Amidst this banter, it appears that both sides are arguing about a black box that neither actually knows about with any real certainty, which presents as the recurring position of the frustrated defense: "you're speculating, you can't know that" - and this is true - we certainly can't know what's actually going on without seeing the code.

    Yet, does not the DS black box speak to us who choose to listen, and us who know what to listen for?

    *Regardless* of the details of the code, the behavior in question is consistent and predictable, so perhaps it isn't so presumptuous for some of us who have struggled with this very problem to speculate about what might be going on in that DS black box. I don't get why that's so taboo in here. It's what we do - how we're wired!

    Perhaps there's some code in there that was written a long time ago (DS 1.0), by some long-gone coder, at a time when a 200 MB drive, 256 MB of memory, and a 1024x768 monitor were high-end (640K anyone?) and a V4 figure had 150 morphs on a good day. Likely that old code serves as the foundation of the entire DS code framework - code that, to fix, would effectively require a full rewrite of the DS code-base and all of the dependent supporting libraries. Darn, that sounds like a lot of work, and *very* expensive. I'd probably consider a full re-write (DS5), rather than an expensive repair effort on an end-of-life product version (DS 4.x).

    (yeah, I know, I can't know that, silly me...)

    But I certainly wouldn't castigate those who *would* speculate about such application behaviors, when in general they're probably more right than wrong - based on both the available evidence and their own experience with one of the most classic of computer-science (and project budget) problems.

    I would counter to someone arguing the contrary, yet *also* not having access to the code, that to simply assume that such speculative analysis is without merit is equally speculative, and comes across as, well... comes across as speculative (let's see if I can keep this post from being filtered).

    No, the better part of this discussion isn't about big-O code design algorithms and possible data optimisation opportunities, it's about knowing the unknowable - based on observed and predictable DS behavior - and it's about business decisions in tension with available resources, and technological trade-offs.

    This isn't personal, at least not until a viable speculative assertion is met with a 'you simply can't know that' and summarily dismissed without counter-substantiation - even if also speculative.

    I have high confidence that if allowed, those who critique and speculate about the situation would be able to *quickly* identify and assess the problem code, and also (but less quickly?) be able to determine the implications and costs of fixing that code, along with the cascading side-effects of those changes, and probably come to a conclusion that would result in the very situation we're seeing now from the DAZ development team.

    Just a guess. And I would fall out of my chair if a DAZ-dev chimed in and said something like "the original code is too expensive to fix - wait 'til you see the DS5 re-write!", or "that code could be fixed, but it works well-enough for now, relative to the other tasks we think are more pressing or beneficial to the community/company".

    But, I would also assert that if the code were open-source (it'd be nice, but I have no expectation of such, nor resentment for the lack of it), the active and savvy critics like myself and TMitP would walk their talk, and (assuming the code were salvageable) do the required work to upgrade the code and its dependencies, and do it well - regardless of how silly that might seem. To assume otherwise, given the available evidence (see TMitP's Sagan and related projects - both the existence and the quality of the code), is irrational and insulting. At least to my eye.

    Per the original issue, I don't see the business case for retro-fitting the DS 4.x figure management core with major and likely disruptive fixes when DS 5.x is on the horizon, unless the code can be fixed one-time to work in both. I would be leaving the 4.x legacy app as stable as I could possibly make it pre-update migration.

    That said, we'll certainly be sensitized to this current (big-O) behavior when DS 5.x comes out, and I'm sure we'll form lotsa opinions about the quality of the DS programmers and newly updated system code's (black box) design and architecture when DS 5.x is released.

    No pressure, guys.

    (And personally, I have nothing but respect for TMitP's walk of his talk, and appreciate that Richard is one of the most consistent, generous, and reliable providers of help and solutions for the wayward, confused, curious, and outright lost DS users that come to these forums for advice and assistance.)

    --ms

  • thenoobducky said:

    TheMysteryIsThePoint said:

    Richard Haseltine said:

    I will repeat this one more time: I am not denying that loading can get painfully slow, I am disputing your assumption - and it is an assumption - that this means bad architecture, unless you are making the trivial point that since the process can get slow then it is a process prone to slowness, without any implication that there might have been much better ways to achieve the same end. That doesn't, of course, mean that there definitely aren't better ways to achieve the same end - but neither of us is in a position to know, and therefore neither of us is in a position to point fingers.

    You are wrong about that, as well. Here is why: On one extreme, we have what the system appears to be doing: at least parsing lots of morphs that do not deform any mesh. On the other extreme, a node's duf file contains references to only the morphs used, and those files in turn have all the vertex offsets necessary to know how to deform the mesh. DS's source code may be locked away, but anyone can read the DSON spec as well as see it in action in an actual .duf file for themselves and protect themselves against false statements like the one you made above.

    Except that is not the root cause of the problem. The biggest slowdown seems to be caused by Daz needing to create every slider in the shape and posing panes along with their associated ERC formulas. This observation comes from watching filesystem activity while Daz loads a figure and looking at what it actually reads.

    The sliders are a desired feature because they are beginner-friendly and many people want them. This is inherently an O(n) process: Daz needs to go through every file to get the sliders. Maybe Daz can optimise the process to be more efficient, or make it multi-threaded, but multi-threading would only buy you a 2-5x performance improvement, considering most CPUs only have 4-8 cores. People have suggested parsing the files only to create dummy sliders and formulas, and creating the actual formulas as needed, but it is not clear to me that deferring the actual formulas would be that much faster, since each file still needs to be parsed completely.

    Seems to me that the only solution is to go outside the box and change the requirements. What Daz really needs is a way for the user to specify that it should load all morphs from a list of folders and all morphs from a list of products, and to allow the user to selectively load morphs from the rest later.

    It seems like there are many, many things that could be done, just none of them are being done.

  • mindsong said:

    Richard Haseltine said:

    mindsong said:

    Richard Haseltine said:

    ...

    Well, if you actually know how to meet the design objectives without the lag on large data sets then you should be able to describe the process without needing access to the code; if you don't have an actual solution then it is quite possible that you would do no better than the Daz developers if you did have access to the code.

    wow. just ... wow.

    --ms

    Please do note that that was in response to an assertion that the other poster would be able to write a better system - I don't hold with the "if you can't do better you can't criticise" school, and have previously been careful to say that a case that the current system is sub-par could be made by pointing to another system that does a similar job with better scalability.

    While it appears that the problem being discussed is sluggish figure loading, based on DS behaviors that are simple to describe, predict, and repeat...

    I propose that a parallel issue being discussed here is the audacity of savvy programmer/users that would, without seeing the code (!), speculate as to why this is occurring, and even more irritating, folks that would propose reasons as to why it's still happening after all of these years and suggest (!) possible solutions.

    How dare they!

    This is hardly a new behavior in here, and in fact it seems to be an open wound that keeps getting (usually inadvertently) poked by rational and frustrated programmer/users who seem to know this business pretty well.

    Amidst this banter, it appears that both sides are arguing about a black box that neither actually knows about with any real certainty, which presents as the recurring position of the frustrated defense: "you're speculating, you can't know that" - and this is true - we certainly can't know what's actually going on without seeing the code.

    Yet, does not the DS black box speak to us who choose to listen, and us who know what to listen for?

    *Regardless of* the details of the code, the behavior in question is consistent and predictable, so perhaps it isn't so presumptuous for some of us who have struggled with this very problem to speculate about what might be going on in that DS black-box. I don't get why that's so taboo in here. It's what we do - how we're wired!

    Perhaps there's some code in there that was written a long time ago (DS 1.0), by some long-gone coder, at a time when a 200 MB drive, 256 MB of memory, and a 1024x768 monitor were high-end (640K anyone?) and a V4 figure had 150 morphs on a good day. Likely, that old code serves as the foundation of the entire DS code framework - code that, to fix, would effectively require a full rewrite of the DS code-base and all of the dependent supporting libraries. Darn, that sounds like a lot of work, and *very* expensive. I'd probably consider a full re-write (DS5), rather than an expensive repair effort on an end-of-life product version (DS 4.x).

    (yeah, I know, I can't know that, silly me...)

    But I certainly wouldn't castigate those who *would* speculate about such application behaviors, when in general they're probably more right than wrong - based on both the available evidence and their own experience with one of the most classic of computer-science (and project-budget) problems.

    I would counter to someone arguing the contrary, yet *also* not having access to the code, that to simply assume that such speculative analysis is without merit is equally speculative, and comes across as, well... speculative (let's see if I can keep this post from being filtered).

    No, the better part of this discussion isn't about big-O code design algorithms and possible data optimisation opportunities; it's about knowing the unknowable - based on observed and predictable DS behavior - and it's about business decisions in tension with available resources, and technological trade-offs.

    This isn't personal, at least not until a viable speculative assertion is met with a 'you simply can't know that' and summarily dismissed without counter-substantiation - even if also speculative.

    I have high confidence that if allowed, those who critique and speculate about the situation would be able to *quickly* identify and assess the problem code, and also (but less quickly?) be able to determine the implications and costs of fixing that code, along with the cascading side-effects of those changes, and probably come to a conclusion that would result in the very situation we're seeing now from the DAZ development team.

    Just a guess. And I would fall out of my chair if a DAZ-dev chimed in and said something like "the original code is too expensive to fix - wait 'til you see the DS5 re-write!", or "that code could be fixed, but it works well-enough for now, relative to the other tasks we think are more pressing or beneficial to the community/company".

    But I would also assert that if the code were open-source (it'd be nice, but I have no expectation of such, nor resentment for the lack of), the active and savvy critics like myself and TMitP would walk their talk, and (assuming the code were salvageable) do the required work to upgrade the code and its dependencies, and do it well - regardless how silly that might seem. To assume otherwise, with the available evidence (see TMitP's Sagan and related projects - both the existence and the quality of the code), is irrational and insulting. At least to my eye.

    Per the original issue, I don't see the business case for retro-fitting the DS 4.x figure management core with major and likely disruptive fixes when DS 5.x is on the horizon, unless the code can be fixed one-time to work in both. I would be leaving the 4.x legacy app as stable as I could possibly make it pre-update migration.

    That said, we'll certainly be sensitized to this current (big-O) behavior when DS 5.x comes out, and I'm sure we'll form lotsa opinions about the quality of the DS programmers and newly updated system code's (black box) design and architecture when DS 5.x is released.

    No pressure, guys.

    (And personally, I have nothing but respect for TMitP's walk of his talk, and appreciate that Richard is one of the most consistent, generous, and reliable providers of help and solutions for the wayward, confused, curious, and outright lost DS users that come to these forums for advice and assistance.)

    --ms

    Very well put, as usual, MS.

  • mindsong said:

    Richard Haseltine said:

    mindsong said:

    Richard Haseltine said:

    ...

    Well, if you actually know how to meet the design objectives without the lag on large data sets then you should be able to describe the process without needing access to the code; if you don't have an actual solution then it is quite possible that you would do no better than the Daz developers if you did have access to the code.

    wow. just ... wow.

    --ms

    Please do note that that was in response to an assertion that the other poster would be able to write a better system - I don't hold with the "if you can't do better you can't criticise" school, and have previously been careful to say that a case that the current system is sub-par could be made by pointing to another system that did a similar job with better scalability.

    While it appears that the problem being discussed is sluggish figure loading, based on simple to describe, predict, and repeat DS behaviors...

    I propose that a parallel issue being discussed here is the audacity of savvy programmer/users that would, without seeing the code (!), speculate as to why this is occurring, and even more irritating, folks that would propose reasons as to why it's still happening after all of these years and suggest (!) possible solutions.

    How dare they!

    I think the parallel issue being discussed here is more a matter of equating "there is a problem to be addressed" with "the program design is bad" while offering no plausible alternative solution. "The current program has performance issues" is not mutually exclusive with "the current design is the best that can be done given the requirements". I think everyone here agrees that Daz has performance issues that the devs should address.

    Saying there is an issue that the dev needs to address is fine by me. Saying the current design is bad because this other solution is much better is fine by me. What I disagree with is saying there is a problem therefore the current design is bad without offering any alternative.

     

    This is hardly a new behavior in here, and in fact it seems to be an open wound that keeps getting (usually inadvertently) poked by rational and frustrated programmer/users who seem to know this business pretty well.

    Amidst this banter, it appears that both sides are arguing about a black box that neither actually knows about with any real certainty, which presents as the recurring position of the frustrated defense: "you're speculating, you can't know that" - and this is true - we certainly can't know what's actually going on without seeing the code.

    Yet, does not the DS black box speak to us who choose to listen, and us who know what to listen for?

    *Regardless of* the details of the code, the behavior in question is consistent and predictable, so perhaps it isn't so presumptuous for some of us who have struggled with this very problem to speculate about what might be going on in that DS black-box. I don't get why that's so taboo in here. It's what we do - how we're wired!

    Perhaps there's some code in there that was written a long time ago (DS 1.0), by some long-gone coder, at a time when a 200 MB drive, 256 MB of memory, and a 1024x768 monitor were high-end (640K anyone?) and a V4 figure had 150 morphs on a good day. Likely, that old code serves as the foundation of the entire DS code framework - code that, to fix, would effectively require a full rewrite of the DS code-base and all of the dependent supporting libraries. Darn, that sounds like a lot of work, and *very* expensive. I'd probably consider a full re-write (DS5), rather than an expensive repair effort on an end-of-life product version (DS 4.x).

    (yeah, I know, I can't know that, silly me...)

    But I certainly wouldn't castigate those who *would* speculate about such application behaviors, when in general they're probably more right than wrong - based on both the available evidence and their own experience with one of the most classic of computer-science (and project-budget) problems.

    I would counter to someone arguing the contrary, yet *also* not having access to the code, that to simply assume that such speculative analysis is without merit is equally speculative, and comes across as, well... speculative (let's see if I can keep this post from being filtered).

    I think the issue here is too much speculating without grounding in the available evidence, and making grand statements instead of having discussions about the technical details, which I think can be much more calmly discussed. Talking at a high level just naturally leads to personal attacks, instead of discussing the details where technical merit can actually be assessed.

    No, the better part of this discussion isn't about big-O code design algorithms and possible data optimisation opportunities; it's about knowing the unknowable - based on observed and predictable DS behavior - and it's about business decisions in tension with available resources, and technological trade-offs.

    This isn't personal, at least not until a viable speculative assertion is met with a 'you simply can't know that' and summarily dismissed without counter-substantiation - even if also speculative.

    I have high confidence that if allowed, those who critique and speculate about the situation would be able to *quickly* identify and assess the problem code, and also (but less quickly?) be able to determine the implications and costs of fixing that code, along with the cascading side-effects of those changes, and probably come to a conclusion that would result in the very situation we're seeing now from the DAZ development team.

    Just a guess. And I would fall out of my chair if a DAZ-dev chimed in and said something like "the original code is too expensive to fix - wait 'til you see the DS5 re-write!", or "that code could be fixed, but it works well-enough for now, relative to the other tasks we think are more pressing or beneficial to the community/company".

    "That code could be fixed, but it works well enough for now, relative to the other tasks we think are more pressing or beneficial to the community/company" is where I would put my money. Many people would no doubt be disappointed by it, but DS5 should be "let's port the existing code to Qt5 as fast as possible so people can use it", instead of "let's rewrite everything".

    But I would also assert that if the code were open-source (it'd be nice, but I have no expectation of such, nor resentment for the lack of), the active and savvy critics like myself and TMitP would walk their talk, and (assuming the code were salvageable) do the required work to upgrade the code and its dependencies, and do it well - regardless how silly that might seem. To assume otherwise, with the available evidence (see TMitP's Sagan and related projects - both the existence and the quality of the code), is irrational and insulting. At least to my eye.

    Per the original issue, I don't see the business case for retro-fitting the DS 4.x figure management core with major and likely disruptive fixes when DS 5.x is on the horizon, unless the code can be fixed one-time to work in both. I would be leaving the 4.x legacy app as stable as I could possibly make it pre-update migration.

    Given the wide popularity of Gen 8 content, any fix really needs to work with Gen 8 and not break backward compatibility with DS4.

    That said, we'll certainly be sensitized to this current (big-O) behavior when DS 5.x comes out, and I'm sure we'll form lotsa opinions about the quality of the DS programmers and newly updated system code's (black box) design and architecture when DS 5.x is released.

    No pressure, guys.

    (And personally, I have nothing but respect for TMitP's walk of his talk, and appreciate that Richard is one of the most consistent, generous, and reliable providers of help and solutions for the wayward, confused, curious, and outright lost DS users that come to these forums for advice and assistance.)

    --ms

    Let us focus more on what causes the problem, and how to fix it instead of making grand statements that don't benefit anyone.

  • TheMysteryIsThePoint said:

    thenoobducky said:

    TheMysteryIsThePoint said:

    Richard Haseltine said:

    I will repeat this one more time: I am not denying that loading can get painfully slow, I am disputing your assumption - and it is an assumption - that this means bad architecture, unless you are making the trivial point that since the process can get slow then it is a process prone to slowness, without any implication that there might have been much better ways to achieve the same end. That doesn't, of course, mean that there definitely aren't better ways to achieve the same end - but neither of us is in a position to know, and therefore neither of us is in a position to point fingers.

    You are wrong about that, as well. Here is why: On one extreme, we have what the system appears to be doing: at least parsing lots of morphs that do not deform any mesh. On the other extreme, a node's duf file contains references to only the morphs used, and those files in turn have all the vertex offsets necessary to know how to deform the mesh. DS's source code may be locked away, but anyone can read the DSON spec as well as see it in action in an actual .duf file for themselves and protect themselves against false statements like the one you made above.
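    For illustration, the point about scene files is checkable from the outside: DSON files (.duf/.dsf) are JSON, optionally gzip-compressed, and a saved scene's "scene" object carries a "modifiers" array whose "url" entries point at only the morph assets actually applied. A rough Python sketch (key names follow the published DSON spec; the example URL is hypothetical, and this is not code from DS itself):

    ```python
    import gzip
    import json

    def read_dson(path):
        """DSON files (.duf/.dsf) are JSON, sometimes gzip-compressed."""
        try:
            with gzip.open(path, "rt", encoding="utf-8") as f:
                return json.load(f)
        except OSError:  # not gzipped; fall back to plain JSON
            with open(path, "r", encoding="utf-8") as f:
                return json.load(f)

    def referenced_morphs(scene_path):
        """List the morph asset files a saved scene actually references."""
        doc = read_dson(scene_path)
        mods = doc.get("scene", {}).get("modifiers", [])
        # Each modifier's "url" points at the .dsf holding the deltas,
        # e.g. "/data/.../Morphs/FBMExample.dsf#FBMExample" (made-up path)
        return sorted({m["url"].split("#")[0] for m in mods if "url" in m})
    ```

    Running this over a character scene would show a short list of referenced .dsf files, versus the thousands of installed morph files DS appears to scan on load.
    
    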

    Except that is not the root cause of the problem. The biggest slowdown seems to be caused by Daz needing to create every slider in the Shaping and Posing panes with their associated ERC formulas. This observation comes from looking at filesystem activity when Daz loads a figure and seeing what it actually reads.

    Sliders are a desired feature because they are beginner-friendly and many people want them. This is inherently an O(n) process: Daz needs to go through every file to get the sliders. Maybe Daz can optimise the process to be more efficient, or make it multi-threaded. But multi-threading would only buy you a 2-5x performance improvement, considering most CPUs only have 4-8 cores. People have suggested parsing the files only to create dummy sliders and formulas, and creating the actual formulas as needed. But it is not clear to me that not creating the actual formulas would be much faster, since each file still needs to be parsed completely.

    It seems to me that the only solution is to go outside the box and change the requirements. What Daz really needs is a way for the user to specify that it should load all morphs from a list of folders and all morphs from a list of products, and to allow the user to selectively load morphs from the rest later.
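    A sketch of what that opt-in loading could look like (the function name, folder layout, and file names are all hypothetical - this is not an existing DS API, just the shape of the idea):

    ```python
    from pathlib import PurePosixPath

    def select_morphs(morph_files, allowed_folders):
        """Split morph files into an auto-load set and a deferred set.

        morph_files: content-relative paths to .dsf morph assets.
        allowed_folders: folders the user wants loaded at figure-load
        time; everything else is deferred until explicitly requested.
        """
        allowed = [PurePosixPath(f) for f in allowed_folders]
        keep, defer = [], []
        for f in morph_files:
            p = PurePosixPath(f)
            if any(folder in p.parents for folder in allowed):
                keep.append(f)
            else:
                defer.append(f)
        return keep, defer
    ```

    The win is that load time becomes proportional to the morphs the user actually cares about, not to everything installed - the same effect people currently get by manually repackaging characters with only their essential morphs.
    
    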

    It seems like there are many, many things that could be done, just none of them are being done.

    Let's discuss what those "many, many things" are, and which option you would prefer. Although, given the devs' stated priority in the DS5 thread of getting the program out as soon as possible, I wouldn't hold out hope for changes until DS5 is released and bug-fixed.

  • mindsong Posts: 1,701

    thenoobducky said:

    mindsong said:

    Richard Haseltine said:

    mindsong said:

    Richard Haseltine said:

    ...

    Well, if you actually know how to meet the design objectives without the lag on large data sets then you should be able to describe the process without needing access to the code; if you don't have an actual solution then it is quite possible that you would do no better than the Daz developers if you did have access to the code.

    wow. just ... wow.

    --ms

    Please do note that that was in response to an assertion that the other poster would be able to write a better system - I don't hold with the "if you can't do better you can't criticise" school, and have previously been careful to say that a case that the current system is sub-par could be made by pointing to another system that did a similar job with better scalability.

    While it appears that the problem being discussed is sluggish figure loading, based on simple to describe, predict, and repeat DS behaviors...

    I propose that a parallel issue being discussed here is the audacity of savvy programmer/users that would, without seeing the code (!), speculate as to why this is occurring, and even more irritating, folks that would propose reasons as to why it's still happening after all of these years and suggest (!) possible solutions.

    How dare they!

    I think the parallel issue being discussed here is more a matter of equating "there is a problem to be addressed" with "the program design is bad" while offering no plausible alternative solution. "The current program has performance issues" is not mutually exclusive with "the current design is the best that can be done given the requirements". I think everyone here agrees that Daz has performance issues that the devs should address.

    Saying there is an issue that the dev needs to address is fine by me. Saying the current design is bad because this other solution is much better is fine by me. What I disagree with is saying there is a problem therefore the current design is bad without offering any alternative.

     

    This is hardly a new behavior in here, and in fact it seems to be an open wound that keeps getting (usually inadvertently) poked by rational and frustrated programmer/users who seem to know this business pretty well.

    Amidst this banter, it appears that both sides are arguing about a black box that neither actually knows about with any real certainty, which presents as the recurring position of the frustrated defense: "you're speculating, you can't know that" - and this is true - we certainly can't know what's actually going on without seeing the code.

    Yet, does not the DS black box speak to us who choose to listen, and us who know what to listen for?

    *Regardless of* the details of the code, the behavior in question is consistent and predictable, so perhaps it isn't so presumptuous for some of us who have struggled with this very problem to speculate about what might be going on in that DS black-box. I don't get why that's so taboo in here. It's what we do - how we're wired!

    Perhaps there's some code in there that was written a long time ago (DS 1.0), by some long-gone coder, at a time when a 200 MB drive, 256 MB of memory, and a 1024x768 monitor were high-end (640K anyone?) and a V4 figure had 150 morphs on a good day. Likely, that old code serves as the foundation of the entire DS code framework - code that, to fix, would effectively require a full rewrite of the DS code-base and all of the dependent supporting libraries. Darn, that sounds like a lot of work, and *very* expensive. I'd probably consider a full re-write (DS5), rather than an expensive repair effort on an end-of-life product version (DS 4.x).

    (yeah, I know, I can't know that, silly me...)

    But I certainly wouldn't castigate those who *would* speculate about such application behaviors, when in general they're probably more right than wrong - based on both the available evidence and their own experience with one of the most classic of computer-science (and project-budget) problems.

    I would counter to someone arguing the contrary, yet *also* not having access to the code, that to simply assume that such speculative analysis is without merit is equally speculative, and comes across as, well... speculative (let's see if I can keep this post from being filtered).

    I think the issue here is too much speculating without grounding in the available evidence, and making grand statements instead of having discussions about the technical details, which I think can be much more calmly discussed. Talking at a high level just naturally leads to personal attacks, instead of discussing the details where technical merit can actually be assessed.

    No, the better part of this discussion isn't about big-O code design algorithms and possible data optimisation opportunities; it's about knowing the unknowable - based on observed and predictable DS behavior - and it's about business decisions in tension with available resources, and technological trade-offs.

    This isn't personal, at least not until a viable speculative assertion is met with a 'you simply can't know that' and summarily dismissed without counter-substantiation - even if also speculative.

    I have high confidence that if allowed, those who critique and speculate about the situation would be able to *quickly* identify and assess the problem code, and also (but less quickly?) be able to determine the implications and costs of fixing that code, along with the cascading side-effects of those changes, and probably come to a conclusion that would result in the very situation we're seeing now from the DAZ development team.

    Just a guess. And I would fall out of my chair if a DAZ-dev chimed in and said something like "the original code is too expensive to fix - wait 'til you see the DS5 re-write!", or "that code could be fixed, but it works well-enough for now, relative to the other tasks we think are more pressing or beneficial to the community/company".

    "That code could be fixed, but it works well enough for now, relative to the other tasks we think are more pressing or beneficial to the community/company" is where I would put my money. Many people would no doubt be disappointed by it, but DS5 should be "let's port the existing code to Qt5 as fast as possible so people can use it", instead of "let's rewrite everything".

    But I would also assert that if the code were open-source (it'd be nice, but I have no expectation of such, nor resentment for the lack of), the active and savvy critics like myself and TMitP would walk their talk, and (assuming the code were salvageable) do the required work to upgrade the code and its dependencies, and do it well - regardless how silly that might seem. To assume otherwise, with the available evidence (see TMitP's Sagan and related projects - both the existence and the quality of the code), is irrational and insulting. At least to my eye.

    Per the original issue, I don't see the business case for retro-fitting the DS 4.x figure management core with major and likely disruptive fixes when DS 5.x is on the horizon, unless the code can be fixed one-time to work in both. I would be leaving the 4.x legacy app as stable as I could possibly make it pre-update migration.

    Given the wide popularity of Gen 8 content, any fix really needs to work with Gen 8 and not break backward compatibility with DS4.

    That said, we'll certainly be sensitized to this current (big-O) behavior when DS 5.x comes out, and I'm sure we'll form lotsa opinions about the quality of the DS programmers and newly updated system code's (black box) design and architecture when DS 5.x is released.

    No pressure, guys.

    (And personally, I have nothing but respect for TMitP's walk of his talk, and appreciate that Richard is one of the most consistent, generous, and reliable providers of help and solutions for the wayward, confused, curious, and outright lost DS users that come to these forums for advice and assistance.)

    --ms

    Let us focus more on what causes the problem, and how to fix it instead of making grand statements that don't benefit anyone.

    Good feedback/comments, but let me offer that, to anyone in the business of algorithms and coding (as you are), the problems we see in DS may be self-evident (big-O, etc.) enough that saying anything in more detail is fruitless without actually seeing the code. This lack of detail is likely not a function of over-simplified pie-in-the-sky critique so much as "that problem and its likely cause are clear - why won't the developers address it already?"

    You've clearly done some great/real code (thanks), so I imagine you could see this as a possible reason for the high-level tone you describe, rather than oversimplified criticism from the peanut gallery. I don't see it as being that way. In this thread, I know that TMitP does similar and good work, so going into any more detail actually doesn't make much sense without seeing the code. His language and level of critique are appropriate for this context as I see it. I believe emails between him and DAZ_Rawb would have a decidedly different level of detail.

    Which brings us to our point of agreement: it's probably not the capacity of the devs, but rather the available resources as allocated. Many of these problems are probably on their lists, but unlikely to ever get done due to the more pressing tasks of the current industry, hardware, market-place, etc., as is usually determined by 'market research' - which is understandable but sad, and possibly a miscalculation in some cases.

    And BTW, my appreciation for the practical situation doesn't at all imply that I would make the same decisions! In fact, I sometimes think this place is run by lunatics (been watching for 10-some years now), yet it seems to work for them - who am I to argue with their staying power?

    That said, I tend to respect the 'current user' more than the 'future user'. Probably not good business. I don't see this as DAZ's primary paradigm - but their immediate numbers at any given moment may confirm the 'correctness' of their chosen ethic, whatever it may be (a black box as well). Over the long term, though, I believe they pay a heavy price for what I see them doing. (The Debian Linux distro is a great example of a solid product that follows the 'slow-stable-steady' philosophy that I like, and it is both popular and forked more than any other distro in its domain.)

    I think DAZ mgt and the devs are making a large miscalculation in this specific domain, based on my history of updates and what I see going on in the DS5 conversations - which have little to do with algorithms and design, and everything to do with that tension between moving into the future at the expense of an installed base. DS5 will certainly work, but will it work with my habits, content, and mental paradigm? Or will it be re-framed for a new beginner audience, or higher-margin users, or VR or gaming folks, graphic/design, animation, or...?

    And yes, Richard, this *is* both speculation *and* judgement based on that speculation, and anyone with three neurons can see that. What's the big deal? Do you agree? Do you disagree? Come speculate with me/us, but be willing to call it that, and it'll be interesting. I'd be very interested (another thread) to hear your honest (non-DAZ-moderator) thoughts on the past and future of DAZ and DS, and how you might drive the operation if you had the reins. That'd be fascinating - especially over a beer, heh. I'm certain I'd be both surprised and/or confirmed by your insights. I would also think DAZ would be very interested in your thoughts, considering the front-line trenches you know so well. My guess is they wouldn't weigh your opinions as highly as I would, per what I mentioned above.

    The DS/DAZ future is certainly not an easy problem with an obvious answer, but the underlying trend that I sense concerns me in my workflow. It also confirms the wisdom of my current workflow adjustments. "What DAZ mgt should do" is another thread (it'd be lively, I'm sure).

    I am hoping for some final animation refinements and fixes in the 'last' of the DS 4.x family, but with my sense of the past as my own guide to the future, I'm resigned to being content with DS 4.11 and my alternate tool-sets - and long morph-lagging figure loads, unless I repackage my oft-used characters and figures with just their essential elements - which is my current solution to the OT of this thread.

    best,

    --ms

  • mindsong Posts: 1,701

    thenoobducky said:

    ...

    but DS5 should be "let's port the existing code to Qt5 as fast as possible so people can use it", instead of "let's rewrite everything".

    ...

    Your comments were great, but this particular line would be the basis for a great discussion - again, over beers!

    I'm not so sure that's what's going to happen between the two options, and if I were managing the ball of wax, I think I'd do the re-write. But there are a lot of factors involved, so perhaps I'll state that - given the choice without much consequence - I'd usually prefer to re-write. I'd really like to do what Steve Jobs did with the Mac and the NeXT as a completely new side-project. Although it failed in its own right, the NeXT OS became the macOS we see today when Jobs went back to Apple.

    I'd call the new DS something like ... hmmm ... Carrara! (lol)

    best,

    --ms

     

  • thenoobducky said:

    TheMysteryIsThePoint said:

    thenoobducky said:

    TheMysteryIsThePoint said:

    Richard Haseltine said:

    I will repeat this one more time: I am not denying that loading can get painfully slow, I am disputing your assumption - and it is an assumption - that this means bad architecture, unless you are making the trivial point that since the process can get slow then it is a process prone to slowness, without any implication that there might have been much better ways to achieve the same end. That doesn't, of course, mean that there definitely aren't better ways to achieve the same end - but neither of us is in a position to know, and therefore neither of us is in a position to point fingers.

    You are wrong about that, as well. Here is why: On one extreme, we have what the system appears to be doing: at least parsing lots of morphs that do not deform any mesh. On the other extreme, a node's duf file contains references to only the morphs used, and those files in turn have all the vertex offsets necessary to know how to deform the mesh. DS's source code may be locked away, but anyone can read the DSON spec as well as see it in action in an actual .duf file for themselves and protect themselves against false statements like the one you made above.

    Except that is not the root cause of the problem. The biggest slowdown seems to be caused by Daz needing to create every slider in the Shaping and Posing panes with their associated ERC formulas. This observation comes from looking at filesystem activity when Daz loads a figure and seeing what it actually reads.

    Sliders are a desired feature because they are beginner-friendly and many people want them. This is inherently an O(n) process: Daz needs to go through every file to get the sliders. Maybe Daz can optimise the process to be more efficient, or make it multi-threaded. But multi-threading would only buy you a 2-5x performance improvement, considering most CPUs only have 4-8 cores. People have suggested parsing the files only to create dummy sliders and formulas, and creating the actual formulas as needed. But it is not clear to me that not creating the actual formulas would be much faster, since each file still needs to be parsed completely.

    It seems to me that the only solution is to go outside the box and change the requirements. What Daz really needs is a way for the user to specify that it should load all morphs from a list of folders and all morphs from a list of products, and to allow the user to selectively load morphs from the rest later.

    It seems like there are many, many things that could be done, just none of them are being done.

    Let's discuss what those "many, many things" are, and which option you would prefer. Although, given the devs' stated priority in the DS5 thread of getting the program out as soon as possible, I wouldn't hold out hope for changes until DS5 is released and bug-fixed.

    I was referring to you yourself, and to the suggestions that Richard Haseltine made. But my argument is not that I know better than the devs, or that I can have a precise opinion on code that I have never seen. My position is primarily that it is not reasonable to think that the way things are now is simply the best they can possibly be, rather than the result of an architecture that was sufficient under older circumstances but is not sufficient now, because complex systems always present more opportunities for tradeoffs than professionals will have time to analyze and implement.

    These guys wrote and/or maintain Daz Studio; I'm sure they know what they are doing. It is just not reasonable to state that a certain implementation is the best possible implementation just because no one can demonstrate something better - an especially foregone argument for a closed-source app. Personally, a large portion of the things I have written over a three-decade career are things I would have done differently if I had understood the requirements better, or if they hadn't changed, or if I'd had more domain knowledge of the business, had a PM that would budget for test-driven development, or had just read a certain online article sooner. I did not anticipate how people would react so viscerally to the word "bad", even when objectively used. Heck, by my own definition, I write "bad" code all the time... that is the rule rather than the exception.

     
