Forum Replies Created
On Jan 4, 2024 I sent an email advising users not to use the CRAN version of ssdtools for obtaining model-averaged HCx estimates. Since that time I have had a number of queries asking when the issue will be fixed and what can be done in the meantime. This post addresses both questions.
What’s the timeframe for the release of an officially-endorsed version of ssdtools?
I can’t speak for the processes and timeframes for government approval of ssdtools, but what I can say is that we (the development team) have fixed the known issues affecting the computation of both point and interval model-averaged HCx estimates. We still have more work to do on ssdtools and so an update to the CRAN version is a little ways off – we estimate 2-3 months away.
What can I do in the meantime?
We have a developmental branch of ssdtools on GitHub. The latest version (v1.0.6.9009) incorporates the corrections referred to under question 1 above.
Please click here for instructions.
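If you would rather install directly from R, the following is a minimal sketch, assuming the development code lives in the GitHub repository bcgov/ssdtools; please treat the linked instructions as the definitive source for the repository and branch to use.

# Assumed repository "bcgov/ssdtools" -- check the linked instructions.
install.packages("remotes")                 # helper package for GitHub installs
remotes::install_github("bcgov/ssdtools")   # installs the default branch

# Confirm you have the corrected version (v1.0.6.9009 or later).
packageVersion("ssdtools")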
April 17, 2025 at 11:29 AM in reply to: nsecR – An R package for computing the No Significant Effect Concentration #4861
We have put together a small R package (nsecR) that will enable you to compute a No Significant Effect Concentration from a fitted C-R model. In addition, the program will instantly compute a model-averaged NSEC (or maNSEC) if additional C-R functions are specified.
The package is NOT on CRAN – it must be installed from a tar.gz file. Click here to download the nsecR package.
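As a rough sketch, installation from the downloaded tar.gz can be done from within R; the file name below is a placeholder, so substitute the name (and path) of the file you actually download.

# "nsecR_0.1.0.tar.gz" is a hypothetical file name -- use the actual file.
install.packages("nsecR_0.1.0.tar.gz", repos = NULL, type = "source")
library(nsecR)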
The NSEC is very similar in spirit to the NOEC in that we wish to determine the largest concentration for which the expected response is not statistically different from the expected control response.
Hi Tristan. Thanks for your participation in the webinar and your questions. First, a point of clarification: in response to your question during the webinar about mixtures, I said that SSDs were usually used to derive HC/PC values for single toxicants. While this is the usual application of SSDs, I do acknowledge that they are also used for deriving GVs for mixtures of chemicals (e.g. whole effluent toxicity testing), in which case 'concentration' is in fact a dilution and thus bounded between 0 and 100%. Although there's nothing to stop you from fitting an SSD to this proportion data, there is a disconnect: the SSD models we use are invariably defined on the range (0, Inf).

We (the ssdtools development team) had some discussion around this point following the webinar, in which we discussed the beta distribution as an obvious candidate for data measured as % dilution rather than as an absolute concentration. Joe Thorley suggested that ssdtools is not currently set up to handle such bounded data. My suggestion, which we'll pursue further and which requires no modification to existing software, is to first transform the % dilution data using Y = X/(1 – X), where X is the dilution data (on the scale 0 to 1). The resulting Y values are then on the range (0, Inf) and thus conformable with 'regular' concentration data and SSD models. Once the SSD model(s) have been fitted and inference made on Y, we back-transform using X = Y/(1 + Y) to make inference on the original scale.
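To make the suggested transformation concrete, here is a minimal sketch in R; the dilution values and the fitted HCx value are made up purely for illustration.

# Hypothetical dilution data expressed as proportions on (0, 1).
x <- c(0.02, 0.05, 0.10, 0.25, 0.50, 0.80)

# Forward transform: map (0, 1) onto (0, Inf) so standard SSD models apply.
y <- x / (1 - x)

# ... fit the SSD model(s) to y and obtain, say, a model-averaged HCx ...
hcx_y <- 0.12        # placeholder estimate on the transformed (Y) scale

# Back-transform the estimate (and any interval limits) to the dilution scale.
hcx_x <- hcx_y / (1 + hcx_y)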
The second point I wish to make with respect to SSD modelling for mixtures of toxicants is the implicit, but very strong, assumption that the chemical composition of the effluent is spatially and temporally invariant. This is a big ask and almost certainly untrue. How to effectively deal with this situation (short of repeating the SSD modelling at regular times and places) is an open question, and one we should perhaps also turn our attention to.
Finally, with respect to a 'turnkey' software replacement for CETIS/ToxCalc – we're currently working on new developments in this space. For example, the R package Bayesnec, developed by a team including Rebecca and myself, is publicly available on CRAN. Furthermore, we recently published a paper in ET&C which introduced the No Significant Effect Concentration (NSEC), and a further publication in IEAM will appear soon. Our challenge now is to integrate these and other aspects of contemporary C-R modelling under one unified system/package and, like ssdtools, develop a user-friendly shiny web deployment. This is all food for thought, and obviously someone needs to bankroll the task!
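For anyone wanting to experiment with the package mentioned above, installation is the standard CRAN route; a minimal sketch, assuming the package name on CRAN is the lowercase "bayesnec":

# Package name on CRAN assumed to be the lowercase "bayesnec".
install.packages("bayesnec")
library(bayesnec)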
Cheers,
David
Hi Emily – you are most welcome. It is indeed a team effort and there have been many talented people who have contributed, and will continue to contribute, to this project.
Please come back often to check on new posts and material.
Cheers,
David