Developer’s Log with Jesse Ireland — Volume 2 (August 2023)

Aug 15, 2023

Hey Modefi Community,

Happy to join you once again to share some insight into the day-to-day development process. In the last developer log, I walked you through the first successful end-to-end test under the new smart contract system.

With the remote team integrating the front end with the new contracts, I’ve taken the opportunity to step back and take a broader view of the code.

Looking at the Bigger Picture

Sometimes it can be difficult to see the forest for the trees, and viewing the code from a little further back allows me to spot some bugs that unit tests can miss and that wouldn’t show up until more formal end-to-end testing is carried out.

One example is permissions, or precondition checks. When two contracts interact, it is important that one (or both) of them verifies that the address calling the contract(s) is authorized to do so. If both contracts assume that the other has performed this check, problems arise. Reviewing both the specific code and the overall structure of the smart contract system lets me discover and patch these types of errors before end-to-end testing.
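As an illustrative sketch of what such an explicit precondition check looks like in Solidity, consider the following. The contract and function names here are invented for illustration and are not taken from the Modefi codebase:

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

// Hypothetical example: a contract that explicitly verifies the caller
// instead of assuming another contract has already done so.
contract RequestRegistry {
    address public immutable delegator;

    constructor(address _delegator) {
        delegator = _delegator;
    }

    // Explicit precondition check: only the trusted delegator may call.
    // If both this contract and the delegator each assumed the *other*
    // side performed this check, an unauthorized address could slip through.
    modifier onlyDelegator() {
        require(msg.sender == delegator, "caller is not the delegator");
        _;
    }

    function recordRequest(uint256 requestId) external onlyDelegator {
        // ... state changes that must only happen via the delegator ...
    }
}
```

The key point is that the check lives in exactly one clearly documented place, so neither side of the interaction has to guess whether the other performed it.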

Another advantage of reviewing the code after taking some time away from it is gaining a broader perspective. Sometimes, it might be challenging to remember why I wrote something in a particular way. If it’s not immediately clear to me, it’s likely to be confusing for the auditors when they review the contracts. To enhance clarity, adding a few lines of internal documentation can be beneficial. This is particularly important when the need for bytecode optimization overrides stylistic considerations.

Optimizations (Math Alert!)

When reviewing the code, I actively search for areas that can be optimized. One such optimization involves examining mathematical operations.

In Solidity 0.8, all math is automatically checked for overflow and underflow. Overflow occurs when the result of an operation is too large to represent, and underflow happens when it’s too small. While this safety feature is useful, it can be excessive in some cases, wasting gas and occupying valuable bytecode space. This inefficiency arises whenever we iterate over a collection, such as a set of data requests or users to pay.

Turning this feature off for certain loops does not present any danger. For instance, if a user tried to request 2²⁵⁶ datasets, we would face many other problems before hitting overflow conditions. It might be beneficial for future versions of the language to allow us to specify iteration over a fixed range to mitigate this issue, offering both efficiency and clarity in the code.
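The standard Solidity idiom for this is an `unchecked` block around the loop counter increment. Here is a minimal, hypothetical payout loop (the contract and parameter names are invented for illustration):

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

contract PayoutExample {
    // With Solidity >= 0.8, `++i` would normally carry an overflow check
    // on every iteration. Since `i` is bounded by `users.length`, it can
    // never reach 2**256 - 1, so the check is wasted gas and bytecode.
    // Wrapping the increment in `unchecked` skips it.
    function payAll(address payable[] calldata users, uint256 amount) external {
        for (uint256 i = 0; i < users.length; ) {
            users[i].transfer(amount);
            unchecked {
                ++i; // safe: i < users.length, so no overflow is possible
            }
        }
    }
}
```

The loop condition still guards the bound; only the provably-safe increment opts out of the checked arithmetic.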

Bug Fix Case Study 1: Revoking the Creator’s Ability to Dispute Data

The remote team identified a bug while integrating the new contracts with the front end, where the request creator’s permission to dispute data was mistakenly revoked.

Here’s what happened:

  • By default, a request creator can dispute any data they request.
  • If the creator staked for their dataset and then marked the data as unavailable, their permission to dispute was revoked, just as it would be for any other user. This was not the intended behavior.
  • A solution was needed to check if the withdrawing user is the creator.
  • Initially, the RequestDetails contract stored the creator’s address because fewer contracts needed it. As the system evolved, this method became suboptimal.

The fix:

  • Storing the creator’s address in the ODDelegator contract instead saved gas on various operations that passed through the delegator.
  • It also reduced the size of the ODDelegator contract.
  • The code was further streamlined by minimizing internal function calls.

With just about 5 lines of added code, this change not only fixed the issue but also resulted in gas and bytecode size savings, leading to a more efficient implementation.
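To make the shape of the fix concrete, here is a hypothetical sketch. The names and structure below are invented for illustration; only the idea — keeping the creator’s address in the delegator and checking it on the withdrawal path — comes from the description above:

```solidity
// SPDX-License-Identifier: MIT
pragma solidity ^0.8.0;

// Hypothetical sketch of the fix: the creator's address lives in the
// delegator, so the withdrawal path can check it directly without routing
// through another contract.
contract ODDelegatorSketch {
    mapping(uint256 => address) public requestCreator;

    function markUnavailable(uint256 requestId) external {
        // The creator keeps their dispute permission even after marking
        // data unavailable; only non-creators lose it.
        if (msg.sender != requestCreator[requestId]) {
            _revokeDisputePermission(requestId, msg.sender);
        }
        // ... return the user's stake, etc. ...
    }

    function _revokeDisputePermission(uint256 requestId, address user) internal {
        // ... clear the user's dispute rights for this request ...
    }
}
```

Because the check sits right where the revocation happens, no extra calls to other contracts are needed, which is where the gas and bytecode savings come from.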

Bug Fix Case Study 2: The Problematic Batch Time

An overlooked edge case led to a minor bug that caused inconvenience but did not break anything. Here’s what happened:

When requesting a batch of data, the request creator can:

  • Set a default time for all datasets.
  • Override that time by setting individual times for specific datasets.

The issue occurred when:

  • A user overrode the default time for all the datasets.
  • The default batch time was set earlier than all of the individual dataset times.
  • The user then tried to add more datasets between the default time and the time the first dataset would be open for staking.

Why was this problematic?

  • A request creator can add datasets until staking opens for the first dataset in their request batch. Adding datasets changes the base pay and collateral, so it’s not fair to change the payout after staking begins.
  • The existing algorithm didn’t consider the possibility that the default time wouldn’t apply to any datasets. This meant that it could wrongly prevent a creator from adding datasets, even though the time was irrelevant.
  • While the creator could change the batch time to a future date, that would require an extra transaction and might not align with their intentions. They might want new datasets to start immediately and use the default time.

The solution:

  • It was relatively simple to detect whether the default batch time had been overridden for every dataset.
  • Therefore, the necessary changes were small and localized, effectively resolving the issue.
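The logic above can be sketched roughly as follows. This is a hypothetical reconstruction with invented names, not the actual implementation: the cutoff for adding datasets is the earliest time that actually applies, ignoring a default that every dataset has overridden:

```solidity
// Hypothetical sketch. Additions close when staking opens for the
// *earliest* dataset time that actually applies; a default batch time
// that every dataset has overridden is ignored.
function additionsOpen(
    uint256 defaultTime,
    uint256[] memory overrideTimes // 0 = dataset uses the default time
) internal view returns (bool) {
    uint256 earliest = type(uint256).max;
    bool defaultApplies = false;
    for (uint256 i = 0; i < overrideTimes.length; i++) {
        if (overrideTimes[i] == 0) {
            defaultApplies = true; // at least one dataset uses the default
        } else if (overrideTimes[i] < earliest) {
            earliest = overrideTimes[i];
        }
    }
    // Only let the default time constrain additions if it applies somewhere.
    if (defaultApplies && defaultTime < earliest) {
        earliest = defaultTime;
    }
    return block.timestamp < earliest;
}
```

The fix is a single extra condition (`defaultApplies`), which matches how small and localized the change was.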

What’s Next?

As the documentation and optimization phase is nearing completion, the upcoming step for the smart contracts involves reintegrating other data types and composing a compact set of end-to-end tests. These tests will exercise various system components such as staking, withdrawing, delaying, adding datasets, disputes, and more.

At this stage, the tests don’t need to be comprehensive or exhaustive. The primary goal is to identify any significant issues that may arise from integrating the contracts together as a unified system.

This integration marks the stage where we previously opened the beta for the Mod Squad for the first version of the ODO. With only a few tasks remaining, we are swiftly approaching the next release.




Building the foundation for real world adoption of Oracles and DeFi