Long Load Times – Why is this still a problem?


Comments

  • Richard Haseltine Posts: 99,500

    TheMysteryIsThePoint said:

    PerttiA said:

    lamoid_5f20d3e469 said:

    So then, how do you remove duplicate formulas? And what about the myriad "formula target property not found", "formula object property not found", "modifier not found", and "modifier not created" errors?

    Another thing that causes no end of problems is that DS puts some textures for some scenes in a temp dir. If that dir is cleared, loading the scene brings up error boxes. Why in the world would they put something critical to a scene in a temp directory?

    There appear to be many legacy problems with the code that have gotten worse as the program has become more complex. I imagine getting rid of some of those problems would take a complete redesign and rewrite of the software. And that would then probably prevent the loading of old scene files. I am not criticising the programmers here. Such legacy problems abound in complex programs that have been expanded upon for years. There is likely no easy solution that solves most of the problems.

    The problems you list are not DS problems. They are content-related problems caused by (mostly) inexperienced content creators, often some freebies.

    The fix is to uninstall the content that is causing problems, file a ticket if it was a Daz store-bought item / send feedback to the creator if it wasn't, and wait for a fix.

    No, these are 100% DAZ problems. The framework is theirs, and good frameworks protect the system from errors of all kinds. That's kind of the whole point of a framework: to maintain the integrity of data loaded into the system at all times and not let user (i.e. PA) errors, which are inevitable, degrade the user experience. It makes no sense to burden/entrust the carbon-based part of the system with getting a large set of data perfectly correct. That's what computers are for, and they're a heck of a lot better at it.

    What are you trying to lobby for? Not reporting set-up issues? Quietly aborting the load of content without informing the user? Not giving info in the log (which is accessible only through a specific menu labelled "Troubleshooting") when it is potentially useful to the content creator?

  • Richard Haseltine said:

    Textures in the temp folder will most likely be the working versions of Layered Images, which are stored as recipes. If you move the Layered Textures slider in Edit > Preferences > Interface rightwards, for size/speed, it will spend less time compressing the PNG files when it creates them, and scenes with Layered Images will load more quickly.

    Many of the messages you report are warnings, not errors, and simply reflect the fact that some properties you have installed have links to other properties (e.g. corrective morphs for expressions, to make them work well with the related shape) which you do not have installed.

    Thank you, Richard.

  • Richard Haseltine said:

    TheMysteryIsThePoint said:

    PerttiA said:

    lamoid_5f20d3e469 said:

    So then, how do you remove duplicate formulas? And what about the myriad "formula target property not found", "formula object property not found", "modifier not found", and "modifier not created" errors?

    Another thing that causes no end of problems is that DS puts some textures for some scenes in a temp dir. If that dir is cleared, loading the scene brings up error boxes. Why in the world would they put something critical to a scene in a temp directory?

    There appear to be many legacy problems with the code that have gotten worse as the program has become more complex. I imagine getting rid of some of those problems would take a complete redesign and rewrite of the software. And that would then probably prevent the loading of old scene files. I am not criticising the programmers here. Such legacy problems abound in complex programs that have been expanded upon for years. There is likely no easy solution that solves most of the problems.

    The problems you list are not DS problems. They are content-related problems caused by (mostly) inexperienced content creators, often some freebies.

    The fix is to uninstall the content that is causing problems, file a ticket if it was a Daz store-bought item / send feedback to the creator if it wasn't, and wait for a fix.

    No, these are 100% DAZ problems. The framework is theirs, and good frameworks protect the system from errors of all kinds. That's kind of the whole point of a framework: to maintain the integrity of data loaded into the system at all times and not let user (i.e. PA) errors, which are inevitable, degrade the user experience. It makes no sense to burden/entrust the carbon-based part of the system with getting a large set of data perfectly correct. That's what computers are for, and they're a heck of a lot better at it.

    What are you trying to lobby for? Not reporting set-up issues? Quietly aborting the load of content without informing the user? Not giving info in the log (which is accessible only through a specific menu labelled "Troubleshooting") when it is potentially useful to the content creator?

    None of the above.

    And I know better than to lobby for anything, as I've been "lobbying" for the same one thing for 6 years now.

    But how about a tool for PAs to use to prevent the problem in the first place by validating the data? You mean to tell me that no alarm bells went off when someone at DAZ said "OK, these things have to be unique, but we're going to just have a free-for-all and let everyone define them themselves..."? How about, as part of the QA process that I've heard PAs talking so much about, the file gets run through an automated process that flags IDs that have already been used and either rejects the file or adjusts it to be unique? The IDs are perfectly nominal data, i.e. their only important characteristic is that they be different from any other ID. That's sort of what I mean by "framework"... there are supposed to be systems in place to protect you from yourself and preserve the integrity of the whole system. It doesn't seem like anything like that is being done, and I am wondering why not, if so much depends on them.

    And how about someone running the entire DAZ Studio catalog through such a tool, just letting it run for days in order of decreasing popularity, so that the more popular assets remain relatively unchanged while the less popular products get updates pushed out?
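    The core of such a validation pass would be small. A minimal sketch in Python, assuming a hypothetical, simplified DSF layout in which each parsed file carries a `modifier_library` list of entries with an `id` field (the real format is richer, and real DSF files may also be gzip-compressed):

```python
from collections import defaultdict

def find_duplicate_ids(dsf_documents):
    """Return {id: [sources]} for every modifier id defined in more than one place.

    dsf_documents: iterable of (source_name, parsed_json) pairs, using a
    hypothetical, simplified layout {"modifier_library": [{"id": ...}, ...]}.
    """
    seen = defaultdict(list)
    for source, doc in dsf_documents:
        for modifier in doc.get("modifier_library", []):
            seen[modifier["id"]].append(source)
    return {mid: srcs for mid, srcs in seen.items() if len(srcs) > 1}

# Two hypothetical assets that both define a morph id "SmileWide".
docs = [
    ("vendorA/smile.dsf", {"modifier_library": [{"id": "SmileWide"}]}),
    ("vendorB/grin.dsf", {"modifier_library": [{"id": "SmileWide"}, {"id": "GrinBig"}]}),
]
print(find_duplicate_ids(docs))  # {'SmileWide': ['vendorA/smile.dsf', 'vendorB/grin.dsf']}
```

    A batch run over a whole catalog would just feed every parsed file through the same accumulator; the expensive part is acquiring and parsing the files, not the comparison itself.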

     

  • frank0314 Posts: 13,922

    The only way any of this would be obtainable is if the PA and testers had every single pack in this store installed on their system to find a duplicate file with a validation tool. It's inevitable, with tens of thousands of products, that duplicate files will be made. This is why PAs started naming files with their PA initials or pack initials, which several people seem to have major issues with even though they aren't user-facing files. Even naming with those initials you will get duplicates. I've done it on many of my own products and had to go back and rename the files to something obscure, and I have the couple hundred products on my machine. If I didn't have all of our products on my computer there would be a duplicate when the product is submitted, and unless the tester has all of my products installed they won't realise there is a duplicate, just among my products, let alone all the products on Daz. Unless every product is installed at once on a system and compared with the new one, it will never catch a duplicate file.

  • Leana Posts: 11,404

    I think the suggestion was to have some kind of global inventory of all product IDs that would be updated with each new product, and then the vendor or QA team could have a utility to check whether files for a new product use an existing ID.

    Problem would be how to initialize such an inventory with the thousands of existing products...

  • Richard Haseltine Posts: 99,500

    Leana said:

    I think the suggestion was to have some kind of global inventory of all product IDs that would be updated with each new product, and then the vendor or QA team could have a utility to check whether files for a new product use an existing ID.

    Problem would be how to initialize such an inventory with the thousands of existing products...

    Not to mention products that are being developed concurrently. However, DS does now append an additional character string to morph names to reduce the chance of conflicts - as long as the morph is created and has ERC added inside DS; if the file is hand-edited, or copied from a template as an alternative route to setting up a lot of links, then that may fail.

  • TheMysteryIsThePoint Posts: 2,924
    edited May 30

    Leana - That's a good point, but "thousands" or even tens of thousands is just not a significant number of assets to a digital computer. If such a system could process just one asset per second, it'd do more than 28,000 overnight.

    Frank, Richard: Or how about some type of GUID being used? Or how about the QA check being done dynamically, in DAZ Studio, when the asset is loaded? Or how about making the tool public so that any person wishing to be a responsible PA could check their asset? Or how about a non-technical solution in the form of guidelines for making these important IDs?
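    The GUID idea is cheap to sketch: append a random suffix at creation time so uniqueness no longer depends on humans choosing distinct names. (The function name here is illustrative, not part of any Daz tooling.)

```python
import uuid

def unique_morph_id(base_name):
    # Append a short random suffix so two vendors' "SmileWide" morphs
    # can never collide; the trade-off is a less readable internal id.
    return f"{base_name}_{uuid.uuid4().hex[:8]}"

a = unique_morph_id("SmileWide")
b = unique_morph_id("SmileWide")
assert a != b  # distinct ids even for identical base names
```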

    My point is that the "problems" you guys are suggesting are not really problems, as they are technically surmountable. It just doesn't seem like any effort was made to surmount them, and that's why we're where we are today.

    It will probably not get fixed until the status quo becomes worse than the inconvenience of the fix; there's certainly a calculus at work.

  • Richard Haseltine Posts: 99,500

    TheMysteryIsThePoint said:

    Leana - That's a good point, but "thousands" or even tens of thousands is just not a significant number of assets to a digital computer. If such a system could process just one asset per second, it'd do more than 28,000 overnight.

    Frank, Richard: Or how about some type of GUID being used? Or how about the QA check being done dynamically, in DAZ Studio, when the asset is loaded?

    There's an idea - it could pop up an alert, something like Duplicate Formulas Found, and discard the second formula. Sorry, but what would you expect this check to do differently from what we have?

    Or how about making the tool public so that any person wishing to be a responsible PA could check their asset?

    What tool?

    Or how about a non-technical solution in the form of guidelines for making these important IDs?

    Don't fiddle with the generated names and they should be fine now; prior to that, use something like your PA or artist name in abbreviated form as a prefix for your morphs - this advice has been shared.

    My point is that the "problems" you guys are suggesting are not really problems, as they are technically surmountable. It just doesn't seem like any effort was made to surmount them, and that's why we're where we are today.

    No they are not technically surmountable, they require things that are not really practical (like a list of names, which would inevitably be limited in scope and would also inevitably fail due to different people developing in parallel - not to mention that DS is now changing the way morphs are named to avoid, or at least greatly reduce, clashes).

    It will probably not get fixed until the status quo becomes worse than the inconvenience of the fix; there's certainly a calculus at work.

    "Certainly" is not a synonym for "It seems to me."

  • Richard Haseltine said:

    TheMysteryIsThePoint said:

    Leana - That's a good point, but "thousands" or even tens of thousands is just not a significant number of assets to a digital computer. If such a system could process just one asset per second, it'd do more than 28,000 overnight.

    Frank, Richard: Or how about some type of GUID being used? Or how about the QA check being done dynamically, in DAZ Studio, when the asset is loaded?

    There's an idea - it could pop up an alert, something like Duplicate Formulas Found, and discard the second formula. Sorry, but what would you expect this check to do differently from what we have?

    I think it maybe wouldn't be necessary to discard it, but rather to adjust it to force it to be unique. I don't know if that's viable; the point is that we would actually be thinking about the problem rather than doing nothing to protect the integrity of important IDs.

    Or how about making the tool public so that any person wishing to be a responsible PA could check their asset?

    What tool?

    A tool that adjusts IDs to make them unique. Uniqueness is their only important quality. Vanity is not worth much in the face of a correctly operating system.

    Or how about a non-technical solution in the form of guidelines for making these important IDs?

    Don't fiddle with the generated names and they should be fine now; prior to that, use something like your PA or artist name in abbreviated form as a prefix for your morphs - this advice has been shared.

    I believe Frank has already said that even that does not work. I think we should let go of the vanity and just ensure their uniqueness, their only truly important quality.

    My point is that the "problems" you guys are suggesting are not really problems, as they are technically surmountable. It just doesn't seem like any effort was made to surmount them, and that's why we're where we are today.

    No they are not technically surmountable, they require things that are not really practical (like a list of names, which would inevitably be limited in scope and would also inevitably fail due to different people developing in parallel - not to mention that DS is now changing the way morphs are named to avoid, or at least greatly reduce, clashes).

    It will probably not get fixed until the status quo becomes worse than the inconvenience of the fix; there's certainly a calculus at work.

    "Certainly" is not a synonym for "It seems to me."

    I do stand corrected. But I prefer to think that DAZ appreciates what we are discussing here and, through an act of volition, chose not to do anything because the solutions are more expensive than the problem. I give them that much credit; I mean, they did create the best character creation framework around in Genesis. That's why my opinion is that this is just one more sense in which DAZ Studio's smashing success has outgrown its architecture. Load times with large libraries are irrefutable proof of that. No thought given to collisions in ID-space is irrefutable proof of that.

  • Richard Haseltine Posts: 99,500

    TheMysteryIsThePoint said:

    Richard Haseltine said:

    TheMysteryIsThePoint said:

    Leana - That's a good point, but "thousands" or even tens of thousands is just not a significant number of assets to a digital computer. If such a system could process just one asset per second, it'd do more than 28,000 overnight.

    Frank, Richard: Or how about some type of GUID being used? Or how about the QA check being done dynamically, in DAZ Studio, when the asset is loaded?

    There's an idea - it could pop up an alert, something like Duplicate Formulas Found, and discard the second formula. Sorry, but what would you expect this check to do differently from what we have?

    I think it maybe wouldn't be necessary to discard it, but rather to adjust it to force it to be unique. I don't know if that's viable; the point is that we would actually be thinking about the problem rather than doing nothing to protect the integrity of important IDs.

    The problem is that Daz Studio doesn't "know" what the correct links would be - guessing that those in the DSF file that defines property X should be distinct from those in file Y defining property Y is probably OK, but what about properties that are not defined there? A human could guess that JaneDoe controlling JaneDoeHead and JaneDoeBody should be two triplets of distinct morphs, but turning that into code without also splitting things that shouldn't be split (both JaneDoeBody morphs will have links to joint centres, and those shouldn't be made distinct) is rather a different matter.

    Or how about making the tool public so that any person wishing to be a responsible PA could check their asset?

    What tool?

    A tool that adjusts IDs to make them unique. Uniqueness is their only important quality. Vanity is not worth much in the face of a correctly operating system.

    Using vendor prefixes isn't vanity, it is a way to distinguish properties - to get duplicates it would require two creators with the same initialism to create a character shape with the same base name. Another option, the one DS uses now I think, is to use the count of vertices affected - that is more likely to be unique, but is also more opaque.
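    One way such a content-derived suffix could work - this is an illustration, not DS's actual scheme - is to hash the morph's own delta data, so the same morph always maps to the same id while different morphs that share a base name almost never collide:

```python
import hashlib

def content_suffixed_id(base_name, deltas):
    # Derive a suffix from the morph's delta data itself: deterministic,
    # so re-exporting the same morph yields the same id every time.
    digest = hashlib.sha1(repr(sorted(deltas)).encode()).hexdigest()[:6]
    return f"{base_name}-{digest}"

smile = content_suffixed_id("SmileWide", [(12, 0.1), (47, -0.3)])
grin = content_suffixed_id("SmileWide", [(12, 0.2), (80, 0.5)])
assert smile != grin  # same base name, different morph data
assert smile == content_suffixed_id("SmileWide", [(47, -0.3), (12, 0.1)])  # order-independent
```

    Like the vertex-count approach, the resulting id is opaque to humans, but it removes the need for any shared registry.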

    Or how about a non-technical solution in the form of guidelines for making these important IDs?

    Don't fiddle with the generated names and they should be fine now; prior to that, use something like your PA or artist name in abbreviated form as a prefix for your morphs - this advice has been shared.

    I believe Frank has already said that even that does not work. I think we should let go of the vanity and just ensure their uniqueness, their only truly important quality.

    Neither vertex order nor initialism is guaranteed to work, but they generally do (as long as the links are created in DS and not copy-pasted from a previous file, at least).

    My point is that the "problems" you guys are suggesting are not really problems, as they are technically surmountable. It just doesn't seem like any effort was made to surmount them, and that's why we're where we are today.

    No they are not technically surmountable, they require things that are not really practical (like a list of names, which would inevitably be limited in scope and would also inevitably fail due to different people developing in parallel - not to mention that DS is now changing the way morphs are named to avoid, or at least greatly reduce, clashes).

    It will probably not get fixed until the status quo becomes worse than the inconvenience of the fix; there's certainly a calculus at work.

    "Certainly" is not a synonym for "It seems to me."

    I do stand corrected. But I prefer to think that DAZ appreciates what we are discussing here and, through an act of volition, chose not to do anything because the solutions are more expensive than the problem. I give them that much credit; I mean, they did create the best character creation framework around in Genesis. That's why my opinion is that this is just one more sense in which DAZ Studio's smashing success has outgrown its architecture. Load times with large libraries are irrefutable proof of that. No thought given to collisions in ID-space is irrefutable proof of that.

    Again, "problems still exist" is not synonymous with "no thought was given."

  • Richard Haseltine said:

    TheMysteryIsThePoint said:

    Richard Haseltine said:

    TheMysteryIsThePoint said:

    Leana - That's a good point, but "thousands" or even tens of thousands is just not a significant number of assets to a digital computer. If such a system could process just one asset per second, it'd do more than 28,000 overnight.

    Frank, Richard: Or how about some type of GUID being used? Or how about the QA check being done dynamically, in DAZ Studio, when the asset is loaded?

    There's an idea - it could pop up an alert, something like Duplicate Formulas Found, and discard the second formula. Sorry, but what would you expect this check to do differently from what we have?

    I think it maybe wouldn't be necessary to discard it, but rather to adjust it to force it to be unique. I don't know if that's viable; the point is that we would actually be thinking about the problem rather than doing nothing to protect the integrity of important IDs.

    The problem is that Daz Studio doesn't "know" what the correct links would be - guessing that those in the DSF file that defines property X should be distinct from those in file Y defining property Y is probably OK, but what about properties that are not defined there? A human could guess that JaneDoe controlling JaneDoeHead and JaneDoeBody should be two triplets of distinct morphs, but turning that into code without also splitting things that shouldn't be split (both JaneDoeBody morphs will have links to joint centres, and those shouldn't be made distinct) is rather a different matter.

    I don't think that I can be held to the same standard as DAZ Studio's designers; I'm not asserting that I know all the answers. Just two things: 1) certain things are so important that they'd better be completely thought out, and 2) there's always an engineering solution based on balancing trade-offs such that none of the trade-offs is optimal, but also none is a deal breaker. I simply cannot imagine why it was thought to be OK to leave the integrity of these IDs completely unmanaged, given their importance, when it sounds like you, Frank, Leana and I could have solved the problem at design time given sufficient caffeine. That is all.

    Or how about making the tool public so that any person wishing to be a responsible PA could check their asset?

    What tool?

    A tool that adjusts IDs to make them unique. Uniqueness is their only important quality. Vanity is not worth much in the face of a correctly operating system.

    Using vendor prefixes isn't vanity, it is a way to distinguish properties - to get duplicates it would require two creators with the same initialism to create a character shape with the same base name. Another option, the one DS uses now I think, is to use the count of vertices affected - that is more likely to be unique, but is also more opaque.

    Or how about a non-technical solution in the form of guidelines for making these important IDs?

    Don't fiddle with the generated names and they should be fine now; prior to that, use something like your PA or artist name in abbreviated form as a prefix for your morphs - this advice has been shared.

    I believe Frank has already said that even that does not work. I think we should let go of the vanity and just ensure their uniqueness, their only truly important quality.

    Neither vertex order nor initialism is guaranteed to work, but they generally do (as long as the links are created in DS and not copy-pasted from a previous file, at least).

    My point is that the "problems" you guys are suggesting are not really problems, as they are technically surmountable. It just doesn't seem like any effort was made to surmount them, and that's why we're where we are today.

    No they are not technically surmountable, they require things that are not really practical (like a list of names, which would inevitably be limited in scope and would also inevitably fail due to different people developing in parallel - not to mention that DS is now changing the way morphs are named to avoid, or at least greatly reduce, clashes).

    It will probably not get fixed until the status quo becomes worse than the inconvenience of the fix; there's certainly a calculus at work.

    "Certainly" is not a synonym for "It seems to me."

    I do stand corrected. But I prefer to think that DAZ appreciates what we are discussing here and, through an act of volition, chose not to do anything because the solutions are more expensive than the problem. I give them that much credit; I mean, they did create the best character creation framework around in Genesis. That's why my opinion is that this is just one more sense in which DAZ Studio's smashing success has outgrown its architecture. Load times with large libraries are irrefutable proof of that. No thought given to collisions in ID-space is irrefutable proof of that.

    Again, "problems still exist" is not synonymous with "no thought was given."

    I will concede that if you will concede that neither is there anything practical that distinguishes one case from the other.

     

  • Richard Haseltine Posts: 99,500

    TheMysteryIsThePoint said:

    Richard Haseltine said:

    TheMysteryIsThePoint said:

    Richard Haseltine said:

    TheMysteryIsThePoint said:

    Leana - That's a good point, but "thousands" or even tens of thousands is just not a significant number of assets to a digital computer. If such a system could process just one asset per second, it'd do more than 28,000 overnight.

    Frank, Richard: Or how about some type of GUID being used? Or how about the QA check being done dynamically, in DAZ Studio, when the asset is loaded?

    There's an idea - it could pop up an alert, something like Duplicate Formulas Found, and discard the second formula. Sorry, but what would you expect this check to do differently from what we have?

    I think it maybe wouldn't be necessary to discard it, but rather to adjust it to force it to be unique. I don't know if that's viable; the point is that we would actually be thinking about the problem rather than doing nothing to protect the integrity of important IDs.

    The problem is that Daz Studio doesn't "know" what the correct links would be - guessing that those in the DSF file that defines property X should be distinct from those in file Y defining property Y is probably OK, but what about properties that are not defined there? A human could guess that JaneDoe controlling JaneDoeHead and JaneDoeBody should be two triplets of distinct morphs, but turning that into code without also splitting things that shouldn't be split (both JaneDoeBody morphs will have links to joint centres, and those shouldn't be made distinct) is rather a different matter.

    I don't think that I can be held to the same standard as DAZ Studio's designers; I'm not asserting that I know all the answers. Just two things: 1) certain things are so important that they'd better be completely thought out, and 2) there's always an engineering solution based on balancing trade-offs such that none of the trade-offs is optimal, but also none is a deal breaker. I simply cannot imagine why it was thought to be OK to leave the integrity of these IDs completely unmanaged, given their importance, when it sounds like you, Frank, Leana and I could have solved the problem at design time given sufficient caffeine. That is all.

    That is very much an article of faith, especially assuming that this isn't the optimal tradeoff given the design goals.

    Or how about making the tool public so that any person wishing to be a responsible PA could check their asset?

    What tool?

    A tool that adjusts IDs to make them unique. Uniqueness is their only important quality. Vanity is not worth much in the face of a correctly operating system.

    Using vendor prefixes isn't vanity, it is a way to distinguish properties - to get duplicates it would require two creators with the same initialism to create a character shape with the same base name. Another option, the one DS uses now I think, is to use the count of vertices affected - that is more likely to be unique, but is also more opaque.

    Or how about a non-technical solution in the form of guidelines for making these important IDs?

    Don't fiddle with the generated names and they should be fine now; prior to that, use something like your PA or artist name in abbreviated form as a prefix for your morphs - this advice has been shared.

    I believe Frank has already said that even that does not work. I think we should let go of the vanity and just ensure their uniqueness, their only truly important quality.

    Neither vertex order nor initialism is guaranteed to work, but they generally do (as long as the links are created in DS and not copy-pasted from a previous file, at least).

    My point is that the "problems" you guys are suggesting are not really problems, as they are technically surmountable. It just doesn't seem like any effort was made to surmount them, and that's why we're where we are today.

    No they are not technically surmountable, they require things that are not really practical (like a list of names, which would inevitably be limited in scope and would also inevitably fail due to different people developing in parallel - not to mention that DS is now changing the way morphs are named to avoid, or at least greatly reduce, clashes).

    It will probably not get fixed until the status quo becomes worse than the inconvenience of the fix; there's certainly a calculus at work.

    "Certainly" is not a synonym for "It seems to me."

    I do stand corrected. But I prefer to think that DAZ appreciates what we are discussing here and, through an act of volition, chose not to do anything because the solutions are more expensive than the problem. I give them that much credit; I mean, they did create the best character creation framework around in Genesis. That's why my opinion is that this is just one more sense in which DAZ Studio's smashing success has outgrown its architecture. Load times with large libraries are irrefutable proof of that. No thought given to collisions in ID-space is irrefutable proof of that.

    Again, "problems still exist" is not synonymous with "no thought was given."

    I will concede that if you will concede that neither is there anything practical that distinguishes one case from the other.

    Nor is there anything distinguishing it from a several-month pilgrimage around the great developers seeking enlightenment on the best way to design and code an extensible, open character property system. Assertions of fact need more than just "You can't prove this isn't so."

  • This is going to get a bit long.

     

    @TheMysteryIsThePoint
    The 'solution' you propose for the duplicate formula error sounds good, but there are problems you're not considering.

     

    Let me start by restating the problem and the only solution.

    Duplicate formula errors are the result of two formulas, in either one or multiple DSF/PZ2 (morph) files, attempting to link to the same property while having the same name/ID.

    The solution is to change the ID(s) in the file (for conflicts in a single file) or to change the IDs in one of the files, when dealing with conflicts present in separate files.

     

    TheMysteryIsThePoint's solution attempts to prevent the issue from happening.

    As stated in his post, Daz (the company) would create a database of all IDs contained in all files available through Daz3d.com that are already created for use in DS. Any conflicts would be corrected, either manually or through some form of automation. Subsequently submitted files for publication/distribution would be subject to the same comparison, with corrections made.

    This idea could work and be fairly easily implemented, imho.
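    As a toy model of that central inventory (class and method names here are hypothetical), a registry could simply reject any submission whose IDs collide with something already published:

```python
class IdRegistry:
    """Toy central inventory of published IDs; a real QA pipeline would persist this."""

    def __init__(self):
        self._published = set()

    def submit(self, asset_ids):
        # Reject the whole submission if any id is already taken;
        # otherwise record the new ids as published.
        clashes = sorted(set(asset_ids) & self._published)
        if clashes:
            return False, clashes
        self._published.update(asset_ids)
        return True, []

registry = IdRegistry()
print(registry.submit(["SmileWide", "GrinBig"]))  # (True, [])
print(registry.submit(["SmileWide"]))             # (False, ['SmileWide'])
```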

    Now for the problems with this idea.

     

    The biggest issue: not all the content we use comes from Daz3d. We purchase/procure from other websites, such as Renderosity, or make our own.

    The solution to this problem: either implement forced 'unique IDs' (as has been done in DS), or require a constant internet connection and make DS more intrusive than it already is.

    Neither solution works if the developer or user isn't using the updated version of DS, or if the developer or user chooses not to use the unique-ID system and 'corrects' the IDs with something they prefer.

    Encrypting the saved files is the only way to stop the second one.

     

    The more intrusive solution(s), which again won't work unless the dev or user updates DS, would be to do a comparison against what the dev/user tries to name something during creation, or when they go to save the file.

    The problem here: no internet, no comparison, potential conflict created.

    A local DB won't work, as it won't be current enough to avoid potential conflicts.

     

    And the last problem I'll bring up: users updating.

    Not only do users not all run the latest version of DS; they don't always update their assets, and in some cases there's no update to be had (No Longer Available, NLA content).

     

    In conclusion, the Duplicate Formulas error isn't a problem that can be totally solved; there are just too many variables.

     

    And in my next post, I'll explain the long load time problem and why it can't be solved.

    (other than by not trying to have every asset you own installed/mapped in DS)

    back in a bit.

     

     

  • Richard Haseltine Posts: 99,500

    DrunkMonkeyProductions said:

    The solution is to change the ID(s) in the file(for conflicts in a single file)

    Duplicate Formulas within a single asset file are usually the result of running ERC Freeze twice on the same property - in which case the real fix would be to consolidate the changes from the two formulae into one.

  • Richard Haseltine said:

    TheMysteryIsThePoint said:

    Richard Haseltine said:

    TheMysteryIsThePoint said:

    Richard Haseltine said:

    TheMysteryIsThePoint said:

    Leana - That's a good point, but "thousands" or even tens of thousands is just not a significant number of assets to a digital computer. If such a system could process just one asset per second, it'd do more than 28,000 overnight.

    Frank, Richard: Or how about some type of GUID being used? Or how about the QA check being done dynamically, in DAZ Studio, when the asset is loaded?

    There's an idea - it could pop up an alert, something like Duplicate Formulas Found, and discard the second formula. Sorry, but what would you expect this check to do differently from what we have?

    I think it maybe wouldn't be necessary to discard it, but rather to adjust it to force it to be unique. I don't know if that's viable; the point is that we are actually thinking about the problem rather than doing nothing to protect the integrity of important IDs.

    The problem is that Daz Studio doesn't "know" what the correct links would be - guessing that the links in the DSF that defines property X should be distinct from those in the file defining property Y is probably OK, but what about properties that are not defined there? A human could guess that JaneDoe controlling JaneDoeHead and JaneDoeBody should be two triplets of distinct morphs, but turning that into code without also splitting things that shouldn't be split (both JaneDoeBody morphs will have links to joint centres, and those shouldn't be made distinct) is rather a different matter.

    I don't think that I can be held to the same standard as DAZ Studio's designers; I'm not asserting that I know all the answers. Just two things: 1) certain things are so important that they'd better be completely thought out, and 2) there's always an engineering solution based on balancing trade-offs such that none of the trade-offs is optimal, but also none is a deal breaker. I simply cannot imagine why it was thought to be OK to leave the integrity of these IDs completely unmanaged, given their importance, when it sounds like you, Frank, Leana, and I could have solved the problem at design time given sufficient caffeine. That is all.

    That is very much an article of faith, especially assuming that this isn't the optimal tradeoff given the design goals.

    Yes it is, and perhaps a bit too facetious. But that's born of the extreme anxiety I feel as a developer/architect when I try to reconcile the importance of the integrity of these IDs with the lack of any effective means to manage them.

    Or how about making the tool public, so that any person wishing to be a responsible PA could check their asset?

    What tool?

    A tool that adjusts IDs to make them unique. Uniqueness is their only important quality; vanity is not worth much in the face of a correctly operating system.

    Using vendor prefixes isn't vanity, it is a way to distinguish properties - to get duplicates it would require two creators with the same initialism to create a character shape with the same base name. Another option, the one DS uses now I think, is to use the count of vertices affected - that is more likely to be unique, but is also more opaque.

    Or how about a non-technical solution in the form of guidelines for making these important IDs?

    Don't fiddle with the generated names and they should be fine now; prior to that, use something like your PA or artist name in abbreviated form as a prefix for your morphs - this advice has been shared.

    I believe Frank has already said that even that does not work. I think we should let go of the vanity and just ensure their uniqueness, their only truly important quality.

    Neither vertex order nor initialism is guaranteed to work, but they generally do (as long as the links are created in DS and not copy-pasted from a previous file, at least).

    My point is that the "problems" you guys are suggesting are not really problems, as they are technically surmountable. It just doesn't seem like any effort was made to surmount them, and that's why we're where we are today.

    No they are not technically surmountable, they require things that are not really practical (like a list of names, which would inevitably be limited in scope and would also inevitably fail due to different people developing in parallel - not to mention that DS is now changing the way morphs are named to avoid, or at least greatly reduce, clashes).

    It will probably not get fixed until the status quo becomes worse than the inconvenience of the fix; there's certainly a calculus at work.

    "Certainly" is not a synonym for "It seems to me."

    I do stand corrected. But I prefer to think that DAZ appreciates what we are discussing here and, through an act of volition, chooses not to do anything because the solutions are more expensive than the problem. I give them that much credit; I mean, they did create the best character creation framework around in Genesis. That's why my guided opinion is that this is just one more sense in which DAZ Studio's smashing success has outgrown its architecture. Load times with large libraries are irrefutable proof of that. No thought given to collisions in ID-space is irrefutable proof of that.

    Again, "problems still exist" is not synonymous with "no thought was given."

    I will concede that if you will concede that neither is there anything practical that distinguishes one case from the other.

    Nor is there anything distinguishing it from a several-month pilgrimage around the great developers seeking enlightenment on the best way to design and code an extensible, open character property system. Assertions of fact need more than just "You can't prove this isn't so".

    What could, should, or would have been done doesn't matter. In a closed, proprietary system, one can neither know nor care. One can only directly observe the broken results, all the proof one needs.

     

  • DrunkMonkeyProductions said:

    This is going to get a bit long.

     

    @TheMysteryIsThePoint
    The 'solution' you propose, for the duplicate formula error, sounds good, but there are problems you're not considering.

    @DrunkMonkeyProductions Oh, I'm sure there are :) It really should not be expected that I, or even users who know as much as you do, could actually fix this. Engineering is a long and sinuous process of considering many things that, like you said, "sound good, but..." before one is found that isn't optimal, but can't be shown why it won't work. The suggestions I made were starting points, not ending points, but that's how it works. Your critiques are part of how the solution is found. I think the only point where I disagree with you is when you say that it just can't be fixed. That has just so rarely been the case in my experience. One of the necessary trade-offs might not be optimal and we might not like it, but that's the consequence of not addressing the issue at design time instead of production, i.e. the best solution is not having the problem in the first place. But hindsight is 20/20... it's speculation, but this was such a bad design decision that I really just think that DAZ Studio's success went beyond the designers' wildest dreams and they didn't think it would ever really matter if they didn't protect the IDs with a rock-solid method.

     

  • Richard Haseltine said:

    DrunkMonkeyProductions said:

    The solution is to change the ID(s) in the file(for conflicts in a single file)

    Duplicate Formulas within a single asset file are usually the result of running ERC Freeze twice on the same property - in which case the real fix would be to consolidate the changes from the two formulae into one.

    OK, not one I think I've encountered before.

    The only ones I've seen directly had to do with devs copy/pasting what should have been separate files into one file and breaking it.

     

     

     

     

  • TheMysteryIsThePoint said:

    DrunkMonkeyProductions said:

    This is going to get a bit long.

     

    @TheMysteryIsThePoint
    The 'solution' you propose, for the duplicate formula error, sounds good, but there are problems you're not considering.

    @DrunkMonkeyProductions Oh, I'm sure there are :) It really should not be expected that I, or even users who know as much as you do, could actually fix this. Engineering is a long and sinuous process of considering many things that, like you said, "sound good, but..." before one is found that isn't optimal, but can't be shown why it won't work. The suggestions I made were starting points, not ending points, but that's how it works. Your critiques are part of how the solution is found. I think the only point where I disagree with you is when you say that it just can't be fixed. That has just so rarely been the case in my experience. One of the necessary trade-offs might not be optimal and we might not like it, but that's the consequence of not addressing the issue at design time instead of production, i.e. the best solution is not having the problem in the first place. But hindsight is 20/20... it's speculation, but this was such a bad design decision that I really just think that DAZ Studio's success went beyond the designers' wildest dreams and they didn't think it would ever really matter if they didn't protect the IDs with a rock-solid method.

     

    OK, I see the problem; I should have covered this, but my 'old-timer's' kicked in.

    The problem with this conclusion is that you're not considering/aware of interoperability, specifically with Poser.

    Even if DS had started with a more rigid naming scheme, this error message would still need to exist, as content could be, and still is, made in both programs and used in either program.

    It wasn't until DS 4 and the release of Genesis 1 that a completely standalone figure existed for DS and DS only.

     

     

    As to why it's taken almost 20 years (DS 1 released in 2005) to even implement a partial solution, I'll give the devs the benefit of the doubt and say this was a problem that couldn't be solved until now.

    Solve one problem, create 10 more, as my programming teacher used to tell me.

     

  • crosswind Posts: 6,284
    edited June 1

    DrunkMonkeyProductions said:

    Richard Haseltine said:

    DrunkMonkeyProductions said:

    The solution is to change the ID(s) in the file(for conflicts in a single file)

    Duplicate Formulas within a single asset file are usually the result of running ERC Freeze twice on the same property - in which case the real fix would be to consolidate the changes from the two formulae into one.

    OK, not one I think I've encountered before.

    The only ones I've seen directly had to do with devs copy/pasting what should have been separate files into one file and breaking it.

    Actually, most of the duplicate formula errors resulted from the fact that different content creators specified the same names when they created morph properties. DS (before ver. 4.21.1.29), by default, takes the property's name as the URI ID...
    The way of creating the URI ID has been changed since the above-mentioned version, which prevents IDs from being duplicated at the very beginning... however, I know there are still some vendors using older versions, so...

    Then, modifying DSF(s) directly turned out to be another culprit, whether or not the vendors modified the files carelessly...

    The issue of ERC refreezing could've been avoided by doing an ERC Bake first, before rerigging/refreezing, but some folks didn't know about that...
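    For illustration, the collision crosswind describes goes away once the ID is minted from something other than the bare property name. A hypothetical scheme (not necessarily what DS 4.21.1.29 actually does) could append a random suffix:

```python
import uuid

def make_unique_property_id(base_name: str, vendor_prefix: str) -> str:
    # Combine a vendor prefix, the human-readable name, and a random
    # UUID fragment; two creators picking the same base_name still get
    # distinct IDs, at the cost of readability.
    return f"{vendor_prefix}_{base_name}_{uuid.uuid4().hex[:8]}"

print(make_unique_property_id("Smile", "DMP"))  # e.g. DMP_Smile_3f9a1c2e
```

    The trade-off Richard mentioned earlier applies: the more collision-proof the ID, the more opaque it is to humans reading the file.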

    Post edited by crosswind on
  • crosswind Posts: 6,284

    As per the 80/20 principle, duplicate formula issues always fall in the 20% part, but they've been resulting in terrible loading times as well as a bad user experience. Daz has taken some actions to improve it... I appreciate that, though it's still not 100% perfect.

    Well, an open world is always full of uncertainties and randomness. As for this issue, I personally don't think it's really possible to have 100% perfect robustness in terms of preventing these sorts of errors from happening... let alone for products from 3rd parties or assets created by users, which are out of Daz's control.

    There can never be 100% perfect software/assets, but there are always more ways than problems... Daz can always give users more knowledge and how-tos for troubleshooting.

    For instance, the DS log has been improved, but based on the log data, it'd be even better if DS could tell users how to resolve these errors - a sort of "analysis" or "guidance" that users can better understand... Then a solid knowledge base can be accumulated as well.

     

  • jdavison67 Posts: 639

    Is DAZ optimized to take advantage of multi-core processors? It seems to me that lots of wait times could be reduced if multiple cores could work on different tasks...

    Just wondering....

     

    JD

  • Richard Haseltine Posts: 99,500

    jdavison67 said:

    Is DAZ optimized to take advantage of multi-core processors? It seems to me that lots of wait times could be reduced if multiple cores could work on different tasks...

    Just wondering....

     

    JD

    Not every process is splittable into multiple independent threads, which is the requirement for multi-core CPUs to help.
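    The distinction can be sketched in Python: parsing independent asset files threads cleanly, while the step not shown - wiring the parsed assets into one scene graph, where order matters - would stay serial. (This illustrates the splittable/non-splittable distinction, not how DS is actually implemented; in CPython the GIL also limits how much CPU-bound parsing really speeds up.)

```python
import json
from concurrent.futures import ThreadPoolExecutor

def parse_assets_parallel(raw_files):
    # Each file parses independently of the others, so this stage is
    # splittable across threads.  pool.map returns results in submission
    # order even though workers may *finish* in any order.
    with ThreadPoolExecutor(max_workers=4) as pool:
        return list(pool.map(json.loads, raw_files))

print(parse_assets_parallel(['{"a": 1}', '{"b": 2}']))  # [{'a': 1}, {'b': 2}]
```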

  • IceCrMn Posts: 2,127

    Someone at Daz should have at least 1 computer with everything from the store installed on it.

    It would make it much easier to find and deal with product conflicts such as character names, morph names, etc., etc.

  • jdavison67 said:

    Is DAZ optimized to take advantage of multi-core processors? It seems to me that lots of wait times could be reduced if multiple cores could work on different tasks...

    Just wondering....

     

    JD

    I'm pretty sure they already did. I was developing a plugin to gather the scene's content into one directory, and I noticed that things got loaded in a slightly different order every time... a tell-tale sign that it's threaded. And it is a lot faster.
