Developer’s Log with Jesse Ireland, Volume 5 (November 2023)

November 13, 2023

This is the second article in our four-part “Modefi Update Series”.

Article 1: Project Evolution and Tokenomic Changes (published November 3)

Article 2: You’re reading it!

Article 3: Deep dive on the ODO’s features (upcoming)

Article 4: Unveiling rebrand and updated roadmap (upcoming)

Welcome back to another Developer’s Log! The changes announced on November 3 were controversial, which is understandable. I would like to thank the community members who have expressed their support, and I want to emphasize that the development and deployment of quality products remains of paramount importance to both the team and the long-term success of this project.

The TL;DR for the update is that the On-Demand Oracle (ODO) contracts have been deployed to the FTM testnet for final verification and front-end integration. In the rest of this log, we’ll look at the changes made over the past few weeks and what’s still needed before beta testing and deployment to mainnet.

Token Removal

Since the final call was made to replace MOD’s function in the On-Demand Oracle with each blockchain’s native token, development has been moving swiftly. We are much closer to mainnet deployment than we were previously, as the tokenomic change has sidestepped one of the last major hurdles.

One of the biggest challenges when using MOD within the ODO was getting accurate price data. Price aggregation algorithms are only as good as the data on which they operate, and MOD’s limited number of price feeds and relatively low liquidity (when compared to assets like FTM, MATIC, ETH, etc.) made the data less reliable.

Further complicating things was the fact that MOD tokens needed to be present on every chain the ODO was going to be deployed to, making it far more difficult to expand the product’s reach in the future.

The shift to blockchain native tokens solved these problems, and also came with the additional benefit of creating more space within the smart contracts to implement additional features.

Accounting for Volatility in Collateral

A key feature that was added to the ODO is the ability to adjust dataset payouts based on changes in the underlying collateral’s value. When collateral prices rise, payouts increase, making datasets more attractive to data providers. However, if asset prices drop, payouts become less valuable, potentially causing datasets to be ignored and eventually abandoned.

This issue previously left creators with no choice but to wait or cancel the dataset. Now, creators have the option to increase payouts. Initially, I designed the feature to boost all remaining datasets, but this solution lacked precision. I revised the design so that creators can specify exactly which datasets receive increased payouts. This flexible solution enables creators to prioritize individual datasets, entire batches, or even add tips post-validation.
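To make the idea concrete, here is a minimal Python sketch of how creator-directed payout boosts might behave. This is an illustration, not the actual Solidity contract: the class and method names (`Dataset`, `RequestBatch`, `boost_dataset`, `boost_all`, `tip`) are all assumptions made for this example.

```python
class Dataset:
    """Toy model of a single requested dataset and its payout."""
    def __init__(self, dataset_id, payout):
        self.dataset_id = dataset_id
        self.payout = payout      # denominated in the chain's native token
        self.validated = False

class RequestBatch:
    """Toy model of a batch of datasets created by one data requester."""
    def __init__(self, datasets):
        self.datasets = {d.dataset_id: d for d in datasets}

    def boost_dataset(self, dataset_id, extra):
        # Creator tops up a single, targeted dataset's payout.
        self.datasets[dataset_id].payout += extra

    def boost_all(self, extra_each):
        # Creator tops up every dataset in the batch that is still pending.
        for ds in self.datasets.values():
            if not ds.validated:
                ds.payout += extra_each

    def tip(self, dataset_id, amount):
        # Post-validation tip for a completed dataset.
        ds = self.datasets[dataset_id]
        assert ds.validated, "tips only apply after validation"
        ds.payout += amount
```

The key design point from the log is the granularity: the first draft only supported something like `boost_all`, while the revised design adds per-dataset targeting and post-validation tips.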

Another issue was handling incorrect currency amounts when requesting data. We lacked the ability to detect whether the price received from a DEX was reasonable. Now, we can apply slippage bounds to the expected amount sent and revert the transaction if the received amount falls outside those bounds. This is another situation where the quality and availability of on-chain price data for blockchain-native assets was a big plus. The code that handles this process has already been completed, and was another important step towards delivering a finished product.
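A slippage bound like this can be sketched in a few lines of Python. This is a generic illustration of the technique, not the ODO’s actual code; the function name and the basis-point parameter are assumptions for this example.

```python
def within_slippage(expected, received, max_slippage_bps=100):
    """Return True if `received` is within `max_slippage_bps` basis points
    of `expected`. In a contract, a False result would trigger a revert.
    Amounts are integers (smallest token unit), as on-chain."""
    tolerance = expected * max_slippage_bps // 10_000
    return expected - tolerance <= received <= expected + tolerance
```

For example, with a 100 bps (1%) bound and an expected amount of 1,000 units, any received amount between 990 and 1,010 passes, while anything outside that range would cause the transaction to revert.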

Next Steps

With the above changes made, the contract system is ready for the next phase of beta testing. Once the front end is completed, we’ll be ready to start testing.

This doesn’t mean that there’s nothing left to do, however, as I have already been looking ahead to new tasks:

  • Additional testing to ensure that delays and disputes are properly accounted for, and no one is being slashed incorrectly.
  • Using developer tools like Slither to detect potential gaps in the testing process.
  • Designing and running ‘adversarial’ tests which try to break the system, improving overall security.
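To give a flavor of what an ‘adversarial’ test looks like, here is a small Python sketch in the style of a unit test: it deliberately attempts an invalid action (double-claiming a payout) and asserts the system rejects it. The `PayoutLedger` class and its behavior are invented for this illustration and are not the ODO’s actual code.

```python
class PayoutLedger:
    """Toy payout tracker used only to demonstrate an adversarial test."""
    def __init__(self):
        self.claimed = set()

    def claim(self, provider, dataset_id):
        key = (provider, dataset_id)
        if key in self.claimed:
            # A real contract would revert here.
            raise ValueError("payout already claimed")
        self.claimed.add(key)
        return True

def test_double_claim_rejected():
    # Adversarial scenario: a provider tries to claim the same payout twice.
    ledger = PayoutLedger()
    assert ledger.claim("provider-1", 42)
    try:
        ledger.claim("provider-1", 42)
        assert False, "double claim should have been rejected"
    except ValueError:
        pass  # rejection is the expected, correct behavior
```

The point of such tests is that they encode what *must not* be possible, rather than what should happen on the happy path.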

Additionally, I anticipate finding more optimization opportunities. For instance, I recently identified one in the ‘CostComputer’ contract. I realized I could precompute and store an important value that is calculated from four inputs, and only recompute it when one of those inputs actually changes. Because storage reads are among the costlier operations on EVM chains, replacing repeated reads of the four inputs with a single read of the cached result led to a 4x reduction in the gas consumed during this process.
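The caching pattern described above can be modeled in Python. This is a toy simulation of the idea (compute once, invalidate on input change), not the real CostComputer contract; the class name is borrowed from the log, but the formula and method names are placeholders.

```python
class CostComputer:
    """Toy model: cache a value derived from four inputs and recompute
    it only when one of the inputs actually changes."""
    def __init__(self, a, b, c, d):
        self._inputs = [a, b, c, d]
        self.compute_calls = 0          # tracks how often we pay the "expensive" path
        self._cached = self._compute()

    def _compute(self):
        # Placeholder for the real (expensive) derivation from four inputs.
        a, b, c, d = self._inputs
        self.compute_calls += 1
        return a * b + c * d

    def set_input(self, index, value):
        # Only recompute when the input really changed.
        if self._inputs[index] != value:
            self._inputs[index] = value
            self._cached = self._compute()

    def cost(self):
        # Cheap read of the cached result instead of recomputation.
        return self._cached
```

In the on-chain version, the saving comes from replacing several storage reads (one per input) on every query with a single read of the stored result.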


I hope you enjoyed another peek into the development of the ODO. This was the second article in the four-part update series, so we’ve got two left.

The first article outlined the broader plan, and this article discussed the current state and beta readiness of the ODO. In the next article, I’ll be doing a deep-dive on all ODO system features, outlining its feature set and going into more depth about the consensus model and dispute system.

If that sounds a bit boring, don’t worry: the final article in this series will be from the entire Modefi team and cover the full rebrand and roadmap for the future of the project!




Building the foundation for real world adoption of Oracles and DeFi