1/ Mixing #Chainlink's high-quality data with low-quality data from unproven, centralized, and insecure oracle solutions only results in a dilution of data quality
Decentralization of oracles without taking data quality into account is the fast path to an exploit occurring
2/ While I understand the motivation behind such an approach, the reality is that mixing data from oracles of varying quality is a significant attack vector
@AlphaFinanceLab, you should heavily reconsider using such a solution, and I'll provide additional context as to why
3/ The core motivation comes from a desire to achieve oracle security the same way that blockchains do: decentralization
This is an approach the Chainlink framework has been designed around since day one: creating decentralized oracle networks using independent nodes
4/ However, unlike blockchain nodes which all perform the same deterministic actions to achieve consensus (verifying signatures and hashes), oracles aim to achieve consensus on non-deterministic data from the messy real world
Very different problem with different considerations
5/ Therefore, when integrating an oracle solution, developers shouldn't just look at decentralization, but also ask additional questions
How is the data sourced?
Where is the data sourced from?
Who runs these nodes?
What are the cryptoeconomics?
How is market coverage achieved?
etc
6/ Looking just at the number of oracle nodes does not answer these questions
Not all oracles are created equal, and if you don't take this into account, you will be directly diluting both the data quality and security of your application https://blog.chain.link/the-importance-of-data-quality-for-defi/
7/ For example, when you look at Band Protocol, you see many issues:
-A single centralized cloud function is used by all nodes for data queries
-A single relayer is used for data delivery to other chains
-A lack of market coverage due to mixing data from aggregators and exchanges
8/ Mixing this low-quality data with Chainlink's high-quality data has the opposite of the intended effect: it significantly increases your risk exposure
You can find more information and context about these issues in the thread below https://twitter.com/ChainLinkGod/status/1390779994599612420?s=19
9/ One advantage you might expect from mixing data from multiple oracle solutions is protection against black swan issues
But this is something the Chainlink network already protects against today through client diversity and circuit breakers https://blog.chain.link/circuit-breakers-and-client-diversity-within-the-chainlink-network/
10/ Each Chainlink node operates two software clients in parallel
A primary node running the latest OCR client, updating price feeds continuously
A backup node running the previous, time-tested but more expensive FluxMonitor client, updating a set of backup feeds once a day
11/ If there were ever an issue with the OCR client, the price feed proxy contracts can seamlessly switch over to the FluxMonitor-based backup feeds, with their update frequency increased, resulting in zero downtime
This prevents reliance on any single software implementation
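To illustrate the failover concept (this is only a minimal Python sketch, not the actual proxy contract code; the feed structure, the health flag, and the STALENESS_THRESHOLD value are all assumptions):

```python
import time

STALENESS_THRESHOLD = 3600  # assumed: seconds the primary may lag before failing over

def select_feed(primary_feed: dict, backup_feed: dict) -> dict:
    """Serve the primary (OCR-based) feed while it is healthy and fresh,
    otherwise fall back to the backup (FluxMonitor-based) feed."""
    primary_is_fresh = (time.time() - primary_feed["updated_at"]) <= STALENESS_THRESHOLD
    if primary_feed["healthy"] and primary_is_fresh:
        return primary_feed
    return backup_feed  # consumers keep reading prices, so zero downtime

# Example: the primary client has stalled, so the backup feed is served instead
primary = {"price": 2500.0, "updated_at": time.time() - 7200, "healthy": False}
backup = {"price": 2498.5, "updated_at": time.time() - 600, "healthy": True}
print(select_feed(primary, backup)["price"])  # -> 2498.5
```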
12/ In addition, Chainlink provides circuit breakers, which can be used in various ways to prefer safety over liveness when a value from a Chainlink price feed significantly deviates from a previous update or another source
13/ Circuit breakers don't mix data; instead, Chainlink price feeds remain the primary oracle solution feeding data to contracts, while a secondary oracle solution (or the previous update's value) is used to check for Chainlink price feed deviations and pause consumption of data if needed
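Roughly, the pattern looks like this (a Python sketch under assumed names; the 2% threshold and the pause action are placeholders each application would choose for itself):

```python
DEVIATION_THRESHOLD = 0.02  # assumed: pause if the values disagree by more than 2%

def should_pause(chainlink_price: float, reference_price: float) -> bool:
    """Return True if data consumption should be paused.
    reference_price is the previous Chainlink update or a secondary oracle's value;
    it is only a sanity check and is never mixed into the price itself."""
    deviation = abs(chainlink_price - reference_price) / reference_price
    return deviation > DEVIATION_THRESHOLD

price_from_chainlink = 2500.0
price_from_secondary = 2375.0  # ~5% apart

if should_pause(price_from_chainlink, price_from_secondary):
    # application-specific safety logic, e.g. pause borrowing/liquidations
    print("Deviation detected: pausing data consumption")
```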
14/ This provides the best of both worlds: dApps get access to a high-quality and reliable source of external data in the form of Chainlink data feeds, with support for a safety net that can trigger application-specific logic during deviations (e.g. pause the contract)
15/ This is the approach I would recommend Alpha Finance go with, instead of mixing data from projects of varying quality
When billions in user funds are at stake, oracle security becomes the utmost priority for ensuring exploits do not occur
16/ I hope this thread provided some context on oracle network security and why mixing data across oracle solutions introduces so many issues