“This is not a speculation, nor is it a hedge. This was a deliberate corporate strategy to adopt a bitcoin standard.” Let's unpack it and jump into the economics of Bitcoin:
The essential measures that ended hyperinflation in each of Germany, Austria, Hungary, and Poland were, first, the creation of an independent central bank that was legally committed to refuse the government's demand for additional unsecured credit and, second, a simultaneous alteration in the fiscal policy regime. In English: hyperinflation stops when the central bank can say "no" to the government.
The Federal Reserve is prepared to use its full range of tools to support the flow of credit to households and businesses and thereby promote its maximum employment and price stability goals. In English: we're going to keep printing and lowering rates until jobs are back and inflation is under control. If we print until the sun is blotted out, we'll print in the shade.
“We feel pretty confident that Bitcoin is less risky than holding cash, less risky than holding gold,” MicroStrategy's CEO said in an interview. "BTC is less risky than holding cash or gold long term" is nonsense. We saw before that BTC is more volatile on its face, and that as long as the Fed isn't run by spider monkeys stacked in a trench coat, inflation is likely to stay within reasonable bounds.
In this week's edition of DDDD (Data-driven DD), I'll be going over the real reason why we have been seeing a rally for the past few weeks, defying all logic and fundamentals - retail investors. We'll look into several data sets to see how retail interest in stock markets has reached record levels in the past few weeks, how this affected stock prices, and why we've most likely seen the top at this point, unless we see one of the "positive catalysts" I mentioned in my previous post, which is unlikely (except for more news about Remdesivir).
Disclaimer - This is not financial advice, and a lot of the content below is my personal opinion. In fact, the numbers, facts, or explanations presented below could be wrong and be made up. Don't buy random options because some person on the internet says so; look at what happened to all the SPY 220p 4/17 bag holders. Do your own research and come to your own conclusions on what you should do with your own money, and how levered you want to be based on your personal risk tolerance.
Most people who know me personally know that I spend an unhealthy amount of my free time on finance and trading as a hobby, even competing in paper options trading competitions when I was in high school. A few weeks ago, I had a friend ask if he could call me because he just installed Robinhood and wanted to buy SPY puts after seeing everyone on wallstreetbets post gains posts from all the tendies they’ve made from their SPY puts. The problem was, he actually didn’t understand how options worked at all, and needed a thorough explanation of how options are priced, what strike prices and expiration dates mean, and what the right strategy for buying options is. That’s how I knew we were at the euphoria stage of buying SPY puts - it’s when dumb money starts to pour in, and people start buying securities because they see everyone else making money and they want in, even if they have no idea what they’re buying, and price becomes dislocated from fundamentals. Sure enough, less than a week later, we started the bull rally that we are currently in. Bubbles are formed when people buy something not because of logic or even gut feeling, but because people who previously weren’t involved see their dumb neighbors make tons of money from it, and they don’t want to miss out.
A few days ago, I started getting questions from other friends about what stocks they should buy and if I thought something was a good investment. That inspired me to dig a bit deeper to see how many other people are thinking the same thing.
Ever since March, we’ve seen an unprecedented amount of money pour into the stock market from retail investors.
Google Search Trends
"what stock should I buy" Google Trends 2004 - 2020
"what stock should I buy" Google Trends 12 months
"stocks" Google Trends 2004 - 2020
"stocks" Google Trends 12 months
Robinhood SPY holders
"Robinhood" Google Trends 12 months
wallstreetbets' favorite broker Google Trends 12 months
Excerpt from E*Trade earnings statement
Excerpt from Schwab earnings statement
TD Ameritrade Excerpt
cnbc.com Alexa rank
CNBC viewership & rankings
wallstreetbets comments / day
investing comments / day
What we can see from Reddit numbers, Google Trends, and CNBC stats is that between the first week of March and the first week of April, there was a massive inflow of retail interest in the stock market. Not only that, but this inflow of interest came from all age cohorts, from internet-using Zoomers to TV-watching Boomers. Robinhood SPY holdings and earnings reports from E*Trade, TD Ameritrade, and Schwab have also all confirmed record numbers of new clients, trades, and assets. There’s something interesting going on if you look closer at the numbers. The growth in numbers at brokers designed for “less sophisticated” investors (i.e. Robinhood and E*Trade) is much larger than at real brokers (i.e. Schwab and Ameritrade). This implies that the record number of new users and trade volume is coming from dumb money. The numbers shown here only really apply to the US and Canada, but there’s also data to suggest that record numbers of foreign investors are pouring money into the US stock market as well.
However, after the third week of March, we see the interest start to slowly decline and plateau, indicating that we probably have seen most of those new investors who wanted to have a long position in the market do so.
Pretty much everything past this point is purely speculation, and isn’t really backed up by any solid data, so take whatever I say here with a cup of salt. We could see from the graph that new investor interest started with the first bull trap in the initial decline in early March, and peaked right after the end of the March crash. So it would be fair to guess that we’re seeing a record amount of interest in the stock market from a “buy the dip” mentality, especially from Robinhood-using Millennials. Here are a few points rationalizing this behavior, based on very weak anecdotal evidence.
Sentiment & Magic Crayons
As I mentioned previously, this bull rally will keep going until enough bears convert to bulls. Markets go up when new bullish positions outnumber new bearish positions, and vice versa. Record numbers of new investors, who had never held a position in the market before, fueled the bullish side of this equation despite all the negative data that has come out, dislocating the price from fundamentals. All the smart money that was shorting the markets saw this happening and flipped to become bulls, because you don’t fight the trend, even if the trend doesn’t reflect reality.
From the data shown above, we can see that new investor interest growth started declining in mid-March and began stagnating in early April. The declining volume in SPY since mid-March confirms this. That means, once the sentiment of the new retail investors turns bearish and everyone figures out what the stocks they’re holding are really worth, another sell-off will begin. I saw something very similar a few years ago with Bitcoin. Near the end of 2017, Bitcoin started to become mainstream and saw a flood of retail investors suddenly signing up for Coinbase (i.e. Robinhood) accounts and buying Bitcoin without actually understanding what it is and how it works. Suddenly everyone, from co-workers to grandparents, was talking about Bitcoin and might have thrown a few thousand dollars into it. This appears to be a very similar parallel to what’s going on right now. Of course there are differences here, in that equities have an intrinsic value, although many of them have gone way above what they should be intrinsically worth, and the vast majority of retail investors don’t understand how to value companies. Then, during December, when people started thinking that the market was getting a bit overheated, some started taking their profits, and that’s when prices crashed violently. This flip in sentiment now looks like it has started with equities.
Technical Analysis, or magic crayons, is a discipline in finance that uses statistical analysis to predict market trends based on market sentiment. Of course, a lot of this is hand-wavy and is very subjective; two people doing TA on the same price history can end up getting opposite results, so TA should always be taken with a grain of salt and ideally be backed with underlying justification and not be blindly followed. In fact, I’ve since corrected the ascending wedge I had on SPY since my last post since this new wedge is a better fit for the new trading data.
There’s a few things going on in this chart. The entire bull rally we’ve had since the lows can be modelled using a rising wedge. This is a pattern where there is a convergence of a rising support and resistance trendline, along with falling volume. This indicates a slow decline in net bullish sentiment with investors, with smaller and smaller upside after each bounce off the support until it hits a resistance. The smaller the bounces, the less bullish investors are. When the bearish sentiment takes over across investors, the price breaks below this wedge - a breakdown, and indicates a start of another downtrend.
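The wedge test described above can be checked numerically: fit straight trendlines to the swing lows (support) and swing highs (resistance), and require that both slope upward while converging. A minimal sketch; the swing-point values below are hypothetical, not actual SPY prices:

```python
def slope(ys):
    """Least-squares slope of ys against bar index 0..n-1."""
    n = len(ys)
    mx = (n - 1) / 2
    my = sum(ys) / n
    num = sum((i - mx) * (y - my) for i, y in enumerate(ys))
    den = sum((i - mx) ** 2 for i in range(n))
    return num / den

def is_rising_wedge(swing_lows, swing_highs):
    """Rising wedge: both trendlines slope up, and support rises faster
    than resistance, so the two lines converge."""
    s, r = slope(swing_lows), slope(swing_highs)
    return s > 0 and r > 0 and s > r

# Hypothetical swing points from a rally with smaller and smaller bounces
print(is_rising_wedge([250, 258, 265, 271], [280, 284, 287, 289]))  # True
```

The "support rising faster than resistance" condition is exactly the "smaller upside after each bounce" described above; a real implementation would also check for falling volume.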
This happened when the wedge hit resistance at around 293, which is around the same price as the 200-day moving average, the 62% retracement (considered to be the upper bound of a bull trap), and a price level that acted as support and resistance throughout 2019. The fact that it gapped down to break this wedge is also a strong signal, indicating a sudden swing in investor sentiment overnight. The volume of the breakdown also broke the downward trend in volume we’ve had since the beginning of the bull rally, indicating a sudden surge of people selling their shares. This doesn’t necessarily mean that we will go straight down from here, and I personally think that we will see a head-and-shoulders pattern complete before SPY goes below 274, which is itself a strong support level. In other words, SPY might go from 282 -> 274 -> 284 -> 274 before breaking the 274 support level.
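The 62% retracement mentioned above comes from simple arithmetic on the crash's high and low. A quick sketch using approximate February-high and March-low SPY values (the exact numbers are assumptions); the 61.8% level happens to land near the 293 resistance:

```python
def retracement_levels(high, low, ratios=(0.382, 0.5, 0.618)):
    """Prices at which a bounce off `low` has retraced a given
    fraction of the drop from `high`."""
    drop = high - low
    return {r: low + r * drop for r in ratios}

# Approximate SPY February high (~339) and March low (~218)
levels = retracement_levels(339, 218)
print(round(levels[0.618], 1))  # lands near 293
```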
Doing TA is already sketchy, and doing TA on something like VIX is even more sketchy, but I found this interesting so I’ll mention it. Since the start of the bull rally, we’ve had VIX inside a descending channel. With the breakdown we had in SPY yesterday, VIX has also gapped up to have a breakout from this channel, indicating that we may see future volatility in the next week or so.
Putting Everything Together
Finally, we get to my thesis. This entire bull rally has been fueled by new retail investors buying the dip, bringing stock prices to euphoric levels. Over the past few weeks, we’ve seen the people who waited on the sidelines for years to get into the stock market slowly FOMO into the rally in smaller and smaller volumes, while the smart money has been locking in profits at an even slower rate - hence an ascending wedge. As new retail interest in the stock market slowed down, the number of new bulls started to decline. It looks like Friday might have been the start of bearish sentiment taking over, meaning it’s likely that 293 was the top, unless any significant bullish events happen in the next two weeks, like a fourth round of stimulus, in which case we might see 300. This doesn’t mean we’ll instantly go back to circuit breakers on Monday, and we might see 282 -> 274 -> 284 -> 274 happen before panic, this time by the first-time investors, eventually bringing us down towards SPY 180.
tldr; we've reached the top
EDIT - I'll keep my live thoughts here as we move through this week, in case anyone's still reading this and interested.
5/4 8PM - /ES was red last night but steadily climbed, which was expected since 1h RSI was borderline oversold, leaving us with a slightly green day. /ES looks like it has upward momentum, but is approaching overbought territory now. Expecting it to go towards 284 (possibly where we'll open tomorrow) and bounce back down from that price level.
5/5 Market Open - Well there goes my price target. I guess at this point it might go up to 293 again, but will need a lot of momentum to push back there to 300. Seems like this is being driven by oil prices skyrocketing.
5/5 3:50PM - The upwards price action had very little volume behind it. Seeing a selloff EOD today; could go either way, although I have a bearish bias. Going to hold cash until it goes towards one end of the 274-293 channel (see last week's thesis). Still believe that we will see it drop below 274 next week, but we might be moving sideways in the channel this week and a bit of next week before that happens. Plan for tomorrow is to buy short-dated puts if open < 285. Otherwise, wait till it goes to 293 before buying those puts.
5/5 6PM - What we saw today could be a false breakout above 284. Need tomorrow to open below 285 for that to be confirmed. If so, my original thesis of it going back down to 274 before bouncing back up will still be in play.
5/6 EOD - Wasn't a false breakout. Looks like it's still forming the head-and-shoulders pattern mentioned before, but 288 instead of 284 as the level. Still not sure yet so I'm personally going to be holding cash and waiting this out for the next few days. Will enter into short positions if we either go near 293 again or drop below 270. Might look into VIX calls if VIX goes down near 30.
5/7 Market Open - Still waiting. If we break 289 we're probably heading to 293. I'll make my entry to short positions when we hit that a second time. There's very little bullish momentum left (see MACD 1D), so if we hit 293 and then drop back down, we'll have a MACD crossover event which many traders and algos use as a sell signal. Oil is doing some weird shit.
5/7 Noon - Looks like we're headed to 293. Picked up VIX 32.5c 5/27 since VIX is near 30.
5/7 11PM - /ES is hovering right above 2910, with the 4h and 1h charts bullish on MACD and the 1h almost overbought on RSI. Unless something dramatic happens we'll probably hit near 293 tomorrow, which is where I'll get some SPY puts. We might drop down before ever touching it, or go all the way to 295 (like last time) during the day, but I expect it to close at or below 293. After that I'm expecting a gap down Monday as we start the final leg down next week towards 274. Expecting the 1D MACD to cross over in the final leg down, which will be a signal for bears to take over, and institutions / day traders will start selling again.
5/8 Market Open - Plan is to wait till a good entry today, either when technicals looks good or we hit 293, and then buy some SPY June 285p and July 275p
5/8 Noon - Everything still going according to plan. Most likely going to slowly inch towards 293 by EOD. Will probably pick up SPY puts and more VIX calls at power hour (3 - 4PM). Monday will probably gap down, although there's a small chance of one more green / sideways day before that happens if we have bullish catalysts on the weekend.
5/8 3:55PM - SPY at 292.60. This is probably going to be the closest we get to 293. Bought SPY 290-260 6/19 debit spreads and 292-272 5/15 debit spreads, as well as doubling down on VIX calls from yesterday, decreasing my cost basis. Still looks like there's room for one more green day on Monday, so I left some money on the side to double down if that's the case, although it's more likely than not we won't get there.
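For reference, the expiry payoff of a put debit spread like the 290-260 above is capped on both sides: the loss is limited to the premium paid, the gain to the strike width minus premium. A small sketch with a hypothetical net premium (the actual fill price isn't given in the post):

```python
def put_debit_spread_payoff(spot, long_strike, short_strike, premium):
    """Expiry payoff per share: long the higher-strike put, short the lower-strike put."""
    long_put = max(long_strike - spot, 0.0)
    short_put = max(short_strike - spot, 0.0)
    return long_put - short_put - premium

# 290/260 put debit spread with a hypothetical $8 net premium
for spot in (300, 275, 250):
    print(spot, put_debit_spread_payoff(spot, 290, 260, 8.0))
# loss capped at the $8 premium above 290; gain capped at 290 - 260 - 8 = 22 below 260
```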
5/8 EOD - Looks like we barely touched 293 exactly AH before rebounding down. Too bad you can't buy options AH, but more convinced we'll see a gap down on Monday. Going to work on another post over the weekend and do my updates there. Have a great weekend everyone!
On October 20, Eric Yao, Head of EpiK China, and Leo, Co-Founder & CTO of EpiK, visited the Deep Chain Online Salon to discuss “How EpiK saved the miners eliminated by Filecoin by launching the E2P storage model”. The following is a transcript of the sharing.
Eric: Hello, everyone, I’m Eric. I graduated from the School of Information Science at Tsinghua University, where my Master’s research was on data storage and big-data computing, and I published a number of papers at top industry conferences.
Since 2013, I have invested in Bitcoin, Ethereum, Ripple, Dogecoin, EOS and other well-known blockchain projects, and I have been active in the crypto space as an early technology-focused investor and industry observer with years of blockchain experience. I am also a blockchain community initiator and technology evangelist.
Leo: Hi, I’m Leo, I’m the CTO of EpiK. Before I got involved in founding EpiK, I spent 3 to 4 years working on blockchain, public chain, wallets, browsers, decentralized exchanges, task distribution platforms, smart contracts, etc., and I’ve made some great products. EpiK is an answer to the question we’ve been asking for years about how blockchain should be landed, and we hope that EpiK is fortunate enough to be an answer for you as well.
Q & A
Deep Chain Finance:
First of all, let me ask Eric, on October 15, Filecoin’s main website launched, which aroused everyone’s attention, but at the same time, the calls for fork within Filecoin never stopped. The EpiK protocol is one of them. What I want to know is, what kind of project is EpiK Protocol? For what reason did you choose to fork in the first place? What are the differences between the forked project and Filecoin itself?
First of all, let me answer the first question, what kind of project is EpiK Protocol.
With the Fourth Industrial Revolution already upon us, comprehensive intelligence is one of the core goals of this stage, and the key to comprehensive intelligence is how to make machines understand what humans know and learn new knowledge based on what they already know. Building knowledge graphs at scale is a key step towards full intelligence.
In order to solve the many challenges of building large-scale knowledge graphs, the EpiK Protocol was born. EpiK Protocol is a decentralized, hyper-scale knowledge graph that organizes and incentivizes knowledge through decentralized storage technology, decentralized autonomous organizations, and a generalized economic model. Members of the global community will expand the horizons of artificial intelligence into a smarter future by organizing all areas of human knowledge into a knowledge graph that will be shared and continuously updated as the eternal knowledge vault of humanity.
And then, for what reason was the fork chosen in the first place?
EpiK’s project founders are all senior blockchain industry practitioners and have been closely following the industry development and application scenarios, among which decentralized storage is a very fresh application scenario.
However, during Filecoin’s development, the team found that due to certain design mechanisms and historical reasons, Filecoin had deviated from the original intention of the project: for example, an overly harsh penalty mechanism that threatens to weaken security, and computing-power competition that lets large miners monopolize computing power, and with it the packing rights, since they can inflate their computing power by uploading useless data themselves.
These problems will cause the data environment on Filecoin to get worse and worse, leading to a lack of real value in on-chain data, high data redundancy, and difficulty bringing the project to commercial viability.
In response to these problems, the team proposed introducing multiple roles and a decentralized collaboration platform (a DAO) to ensure the high value of on-chain data through a reasonable economic model and incentive mechanism, and to store high-value data, the knowledge graph, on the blockchain through decentralized storage. This largely solves both the lack of value in on-chain data and the monopoly of large miners’ computing power.
Finally, what differences exist between the forked project and Filecoin itself?
On the basis of the issues above, EpiK’s design is very different from Filecoin’s. First, EpiK is more focused in its business model: it targets a different market and track from the cloud storage market Filecoin competes in, because decentralized storage has no advantage over professional centralized cloud storage in storage cost or user experience.
EpiK focuses on building a decentralized knowledge graph, which reduces data redundancy and safeguards the value of data in the distributed storage chain while preventing the knowledge graph from being tampered with by a few people, thus making the commercialization of the entire project reasonable and feasible.
From the perspective of ecosystem building, EpiK is friendlier to miners and solves Filecoin’s pain points to a large extent. First, it replaces Filecoin’s storage collateral and commitment collateral with a one-time collateral.
Miners participating in EpiK Protocol are only required to pledge 1000 EPK per miner, and only once before mining, not in each sector.
To put 1,000 EPK in perspective: you only need to participate in pre-mining for about 50 days to earn the tokens needed for the pledge. The EPK pre-mining campaign is currently underway and runs from early September to December, with a daily release of 50,000 ERC-20 standard EPK. Pre-mining nodes whose applications are approved divide these tokens according to the day’s mining ratio, and the tokens can be exchanged 1:1 after the mainnet launches. This move will continue to expand the number of miners eligible to participate in EPK mining.
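The "about 50 days" figure is consistent with simple arithmetic if a node's daily haul is around 20 EPK; the per-node share below is a hypothetical illustration, not a published figure:

```python
DAILY_RELEASE = 50_000  # ERC-20 EPK released per day during pre-mining (from the text)
PLEDGE = 1_000          # one-time pledge per miner (from the text)

def days_to_earn_pledge(epk_per_day):
    """Days of pre-mining needed to accumulate the 1,000 EPK pledge."""
    return PLEDGE / epk_per_day

# A node earning 20 EPK/day, i.e. 0.04% of the 50,000 EPK daily release (hypothetical share)
print(days_to_earn_pledge(20))  # 50.0
```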
Secondly, EpiK has a more lenient penalty mechanism, different from Filecoin’s official consensus, storage and contract penalties, because data can only be uploaded by field experts under the “Expert to Person” model. Every miner’s data is backed up, which means that if one or more miners go offline, the network is not much affected, and a miner who fails to submit the proof of spacetime in time because of being offline only loses the effective computing power of that sector, not the pledged coins.
If the miner can re-submit the proof of time and space within 28 days, he will regain the power.
Unlike Filecoin’s 32GB sectors, EpiK’s encapsulated sectors are smaller, only 8M each, which will solve Filecoin’s sector space wastage problem to a great extent, and all miners have the opportunity to complete the fast encapsulation, which is very friendly to miners with small computing power.
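The difference in sector granularity is easy to quantify: per terabyte of raw capacity, 8 MB sectors give vastly more (and much faster-to-seal) encapsulation units than 32 GB sectors:

```python
MB_PER_TB = 1024 * 1024

filecoin_sectors_per_tb = MB_PER_TB // (32 * 1024)  # 32 GB sectors
epik_sectors_per_tb = MB_PER_TB // 8                # 8 MB sectors

print(filecoin_sectors_per_tb, epik_sectors_per_tb)  # 32 131072
```

With 4,096 times as many units per terabyte, a partially filled sector wastes at most 8 MB instead of up to 32 GB, which is the space-wastage point made above.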
The data and quality constraints will also ensure that the gap in effective computing power between large and small miners does not grow too wide.
Finally, unlike Filecoin’s P2P data-uploading model, EpiK changes data uploading and maintenance to E2P uploading: field experts upload the data and ensure its quality and value on the chain. At the same time, a rational economic model introduces a game relationship between data-storage roles and data-generation roles, ensuring the stability of the whole system and a continuous, high-quality output of on-chain data.
Deep Chain Finance:
Eric, on the eve of Filecoin’s mainline launch, issues such as Filecoin’s pre-collateral have aroused a lot of controversy among the miners. In your opinion, what kind of impact will Filecoin bring to itself and the whole distributed storage ecosystem after it launches? Do you think that the current confusing FIL prices are reasonable and what should be the normal price of FIL?
The Filecoin mainnet has launched and many potential problems have been exposed, such as the aforementioned high pre-collateral, the storage-resource waste and computing-power monopoly caused by unreasonable sector encapsulation, and the harsh penalty mechanism. These problems are quite serious and will greatly affect the development of the Filecoin ecosystem. Here are two examples to illustrate. Take the monopoly of computing power by big miners: once a big miner has monopolized computing power, a very delicate state arises. When a miner stores a file for an ordinary user, there is no way to verify on the chain whether what he stored was uploaded by himself or by someone else, because a miner can fake another identity and upload data for himself. That means that when any miner chooses which data to store, he has only one goal: inflating his computing power as fast as possible.
Saving other people’s data and saving my own data count the same toward computing power. When I store someone else’s data, I know nothing about it, and the uploader may be anywhere in the world, so the bandwidth quality between us may not be good enough.
The economically best option is therefore to store my own local data, and that results in no one storing other people’s data on the chain at all. Everyone stores only their own data, because it is the most economical for them, so the network has essentially no storage utility: no one is providing storage for the mass of retail users.
The harsh penalty mechanism will also severely deplete miners’ profits, because DDoS attacks are a very common technique, and a big miner can make a very high profit in a short period by attacking other miners; attacking is profitable for all big miners.
As things stand, the vast majority of miners are not very well maintained, so they are poorly protected against even low-grade DDoS attacks. The penalty regime is therefore grim for them.
The contradiction between an unreasonable system and real demand will inevitably push the system to evolve in a more reasonable direction, so there will be many forked projects with more reasonable mechanisms, attracting Filecoin miners and diverting storage power.
Since each project is on the decentralized storage track, their demands on miners are similar or even mutually compatible, so miners will gravitate to the forked projects with better economics and business scenarios, filtering out the projects with real, grounded value.
As for the chaotic FIL price: FIL is a project that has been years in the making and carries too many expectations, so the current situation has its reasons. There is no way to predict a reasonable price for FIL, because in the long run one must consider whether the project can be commercialized and what the on-chain data is actually worth. In other words, we need to keep observing whether Filecoin becomes a game of computing power or a real carrier of value.
Deep Chain Finance:
Leo, we just mentioned that Filecoin’s pre-collateral caused dissatisfaction among miners. After the mainnet launch, the test coins from the second round of the Space Race were directly converted into real coins, and the team sold FIL into the market, so many miners said they were betrayed. What I want to know is: EpiK’s motto is “save the miners eliminated by Filecoin”. How will EpiK deal with Filecoin’s various problems, and how will EpiK achieve this “rescue”?
Filecoin’s tacit approval of computing-power padding effectively declared that the team had chosen to abandon small miners. And converting test coins into real coins hurt the interests of the loyal big miners in one stroke. We do not know why such basic mistakes were made; we can only regret them.
EpiK didn’t set out to fork Filecoin. Rather, to build a shared knowledge graph ecosystem, EpiK had to integrate decentralized storage, so it chose Filecoin’s battle-tested PoRep and PoSt decentralized verification technology. To ensure the quality of knowledge graph data, EpiK only allows community-voted field experts to upload data, so EpiK naturally prevents miners from padding computing power, and there is no reason for valueless data to take up such expensive decentralized storage resources.
With computing-power padding impossible, the difference between big and small miners is minimal while the amount of knowledge graph data is small.
We can’t say that we can save the big miners, but we are definitely the optimal choice for the small miners who are currently in the market to be eliminated by Filecoin.
Deep Chain Finance:
Let me ask Eric: according to the EpiK protocol, EpiK adopts the E2P model, which allows only field experts who have been voted in to upload data. This is very different from Filecoin’s P2P model, which allows individuals to upload data as they wish. In your opinion, what are the advantages of the E2P model? If only voted-in experts can upload data, does that mean the EpiK protocol is not available to everyone?
First, let me explain the advantages of the E2P model over the P2P model.
There are five roles in the DAO ecosystem: miners, coin holders, field experts, bounty hunters, and gateways. These five roles share the EPK generated each day once the mainnet launches.
Miners receive 75% of the EPK, field experts receive 9%, and voting users share 1%.
The other 15% fluctuates based on the network’s daily traffic, and that 15% is partly a game between miners and field experts.
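The split above can be sanity-checked in a few lines; the daily issuance figure below is a placeholder, since the text does not state the mainnet daily total:

```python
# Daily issuance split across roles, per the percentages in the text
ALLOCATION = {
    "miners": 0.75,
    "field_experts": 0.09,
    "voting_users": 0.01,
    "floating": 0.15,  # fluctuates with daily network traffic
}

def daily_split(daily_epk):
    """EPK each role group receives out of one day's issuance."""
    return {role: daily_epk * share for role, share in ALLOCATION.items()}

split = daily_split(100_000)  # 100k EPK/day is a placeholder figure
assert abs(sum(split.values()) - 100_000) < 1e-6  # shares add up to 100%
```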
First, let me describe the relationship between these two roles.
The first group of field experts is selected by the Foundation and covers different areas of knowledge (a wide range, including not only serious subjects but also home, food, travel, etc.). This group can recommend the next group of field experts, and a recommended expert only needs 100,000 EPK votes to become a field expert.
The field expert’s role is to submit high-quality data to the miner, who is responsible for encapsulating this data into blocks.
Network activity is judged by the amount of EPKs pledged by the entire network for daily traffic (1 EPK = 10 MB/day), with a higher percentage indicating higher data demand, which requires the miner to increase bandwidth quality.
If data demand decreases, field experts need to provide higher-quality data. It is like a library: more visitors require more seats, i.e., paying the miner to upgrade the bandwidth, while fewer visitors call for buying better books to attract them, i.e., paying bounty hunters and field experts to generate more quality knowledge graph data. The game between miners and field experts is the most important game in the ecosystem, unlike the game between the authorities and big miners in the Filecoin ecosystem.
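The 1 EPK = 10 MB/day rule translates directly into a pledge requirement for a given daily access rate; a small sketch (the 1 GB/day figure is just an example):

```python
MB_PER_DAY_PER_EPK = 10  # from the text: 1 EPK pledged buys 10 MB/day of access traffic

def pledge_required(mb_per_day):
    """EPK that must be pledged to support `mb_per_day` of daily access traffic."""
    return mb_per_day / MB_PER_DAY_PER_EPK

print(pledge_required(1024))  # 102.4 EPK for ~1 GB/day
```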
This game between data producers and data storers, together with a more rational economic model, means the E2P model produces stored on-chain data of much higher quality than the P2P model, with better bandwidth for data access, resulting in greater business value and more practical deployment scenarios.
I will then answer the question of whether this means that the EpiK protocol will not be universally accessible to all.
The E2P model only constrains the quality of the data that is generated and stored, not who can participate in the ecosystem. On the contrary, with the introduction of the DAO model, the EpiK ecosystem offers a wide variety of roles, including roles open to ordinary people (anyone competent to complete tasks can be a bounty hunter), giving everyone a logical way to participate in the system.
For example, a miner with computing power can provide storage, a person with a certain domain knowledge can apply to become an expert (this includes history, technology, travel, comics, food, etc.), and a person willing to mark and correct data can become a bounty hunter.
The presence of various efficient support tools from the project owner will lower the barriers to entry for various roles, thus allowing different people to do their part in the system and together contribute to the ongoing generation of a high-quality decentralized knowledge graph.
Deep Chain Finance:
Leo, some time ago EpiK released a white paper and an economic white paper, explaining the EpiK concept from the perspectives of technology and the economic model respectively. What I would like to ask is: what are the shortcomings of current distributed storage projects, and how will the EpiK protocol improve on them?
“Distributed storage” can easily be confused with offerings like Ali’s OceanDB, but in the blockchain field we should focus on decentralized storage first.
There is a big problem with the decentralized storage on the market now, best summed up by the idiom “why not eat meat porridge?”, i.e. an answer that is out of touch with what users actually need.
How to understand this? By its technical principles, decentralized storage is not cheaper than centralized storage; if it were, centralized storage would be hopelessly uncompetitive.
What incentive does the average user have to spend more money on decentralized storage to store data?
Is it safer?
Storage miners in a decentralized network can shut down at any time, so it is by no means more secure than simply keeping a copy each with, say, Ali and Amazon.
There’s no difference between storing encrypted data on decentralized storage and storing encrypted data on Amazon.
And 10,000 gigabytes of bandwidth spread across decentralized storage simply doesn’t compare to the fiber in a centralized server room. This is the root problem of the business model: no one is using it, no one is buying it, so what is the big vision worth?
The goal of EpiK is to guide all community participants in jointly building and sharing domain knowledge graph data, which is the best way for robots to understand human knowledge. The more knowledge graph data there is, the more knowledge a robot has, and the more intelligent it becomes, exponentially so. In other words, EpiK uses decentralized storage technology to capture the value of exponentially growing data with linearly growing hardware costs, and that is where the buy-in for EPK comes from.
Organized data is worth a lot more than organized hard drives, and there is a demand for EPK when robots have the need for intelligence.
Deep Chain Finance:
Let me ask Leo, how many forked projects does Filecoin have so far, roughly? Do you think there will be more or less waves of fork after the mainnet launches? Have the requirements of the miners at large changed when it comes to participation?
We don’t have specific statistics. Now that the main network has launched, we expect forked projects to increase; there are many sidelined miners in the market who need to be organized efficiently.
However, most forked projects we currently see simply modify the parameters of Filecoin’s economic model, which is undesirable. Modifications at that level can’t change the status quo of miners faking computing power; they just make some of the big miners more comfortable mining, which won’t help decentralized storage achieve real adoption.
We need more reasonable deployment scenarios so that idle mining resources can be turned into effective productivity, building a 100x project instead of riding one wave of FOMO sentiment after another.
Deep Chain Finance:
How far along is the EpiK Protocol project, Eric? What other big moves are coming in the near future?
The development of the EpiK Protocol is divided into five major phases:
Phase I: the “Obelisk” test network.
Phase II: Mainnet 1.0, “Rosetta”.
Phase III: Mainnet 2.0, “Hammurabi”.
Phase IV: enriching the knowledge graph toolkit.
Phase V: enriching the knowledge graph application ecosystem.
We are currently in the first phase, the “Obelisk” test network. Anyone can sign up to participate in the testnet pre-mining to obtain ERC-20 EPK tokens, which can be exchanged one-to-one for mainnet tokens after launch.
We have recently launched ERC20 EPK on Uniswap, you can buy and sell it freely on Uniswap or download our EpiK mobile wallet.
In addition, we will soon launch the EpiK Bounty platform, and we welcome all community members to do tasks together to build the EpiK community. At the same time, we are also pushing forward token listings on centralized exchanges.
Some KOLs have said that Filecoin has already consumed the next few years of its value, so it will plunge. What do you think?
First of all, judging the market means judging the cycle. Being bearish on FIL in the short term is not the same as being bearish on the project’s economic model, or on the distributed storage track as a whole.
We are very confident in the distributed storage track. It will certainly go through phases of growth and decline, and that process selects for better projects.
Since the existing group of miners and the computing power already produced are fixed, and since EpiK miners and FIL miners are compatible, miners can at any time switch to whichever project is more promising and more economically viable.
As for the claim that Filecoin has consumed the next few years of its value and will therefore plunge: a plunge is not something I would predict. In this industry you have to keep learning, iterating, and making value judgments. Market sentiment drives prices up and down, but there are other, more important factors, such as the big washout in March this year. So we can only say such events slow down the development of the FIL community; prices themselves are indeed unpredictable.
Actually, in the end, if there are no applications and no one actually uploads data, the market value will drop. So what are the real-world applications of EpiK?
Leo: The best and most direct application of EpiK’s knowledge graph is the question and answer system, which can be an intelligent legal advisor, an intelligent medical advisor, an intelligent chef, an intelligent tour guide, an intelligent game strategy, and so on.
What is Grayscale?
Grayscale is the company that created the ETHE product. Their website is https://grayscale.co/
What is ETHE?
ETHE is essentially a stock that is intended to loosely track the price of ETH. It does so by having each ETHE share backed by a specific amount of ETH held on chain. Initially, newly minted ETHE can only be purchased by institutions and accredited investors directly from Grayscale. Once a year has passed (6 months for GBTC), it can then be listed on the OTCQX Best Market exchange for secondary trading. Once listed on OTCQX, any investor can purchase it. Additional information on ETHE can be found here.
So ETHE is an ETF?
No. For technical reasons beyond my personal understanding, it is not labeled an ETF. I know it all flows back to “Securities Act Rule 144”, but due to my limited knowledge of SEC regulations I don’t want to misspeak past that. If anyone is more knowledgeable on the subject, I am happy to add their answer here.
How long has ETHE existed?
ETHE was formed 12/14/2017. GBTC was formed 9/25/2013.
How is ETHE created?
The trust will issue shares to “Authorized Participants” in groups of 100 shares (called baskets). Authorized Participants are the only persons that may place orders to create these baskets, and they do it on behalf of the investor.
How does Grayscale acquire the ETH to collateralize the ETHE product?
An investor may acquire ETHE by paying in cash or by exchanging ETH already owned.
Where does Grayscale store their ETH? Does it have a specific wallet address we can follow?
ETH is stored with Coinbase Custody Trust Company, LLC. I am unaware of any specific address or set of addresses that can be used to verify the ETH is actually there.
Can ETHE be redeemed for ETH?
No, currently there is no way to give your shares of ETHE back to Grayscale to receive ETH back. The only method of getting back into ETH would be to sell your ETHE to someone else and then use those proceeds to buy ETH yourself.
Why are they not redeeming shares?
I think the report summarizes it best:
Redemptions of Shares are currently not permitted and the Trust is unable to redeem Shares. Subject to receipt of regulatory approval from the SEC and approval by the Sponsor in its sole discretion, the Trust may in the future operate a redemption program. Because the Trust does not believe that the SEC would, at this time, entertain an application for the waiver of rules needed in order to operate an ongoing redemption program, the Trust currently has no intention of seeking regulatory approval from the SEC to operate an ongoing redemption program.
Source: Redemption Procedures on page 41 of the “Grayscale Ethereum Trust Annual Report (2019)” – Located Here
What is the fee structure?
ETHE has an annual fee of 2.5%. GBTC has an annual fee of 2.0%. Fees are paid by selling the underlying ETH / BTC collateralizing the asset.
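Since fees are paid by selling the underlying collateral, the ETH backing each share shrinks over time. A minimal sketch, assuming for simplicity that the fee compounds annually (the trust actually accrues it continuously):

```python
def eth_per_share(initial_backing: float, annual_fee: float, years: float) -> float:
    """ETH backing per share after the annual fee is deducted from the collateral."""
    return initial_backing * (1 - annual_fee) ** years

# Starting from the 6/19/2020 ETHE backing ratio, at the 2.5% annual fee:
start = 0.09391605
for y in (1, 5, 10):
    print(y, "years:", round(eth_per_share(start, 0.025, y), 8))
```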
What is the ratio of ETH to ETHE?
At the time of posting (6/19/2020), each ETHE share is backed by 0.09391605 ETH. Each share of GBTC is backed by 0.00096038 BTC.
Why is the ratio not 1:1? Why is it always decreasing?
While I cannot say for certain why the initial distribution was not a 1:1 backing, it was more than likely done to keep the share price down and allow more investors a chance to purchase ETHE / GBTC. The ratio decreases over time because the trust’s fees are paid by selling small amounts of the underlying ETH.
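The secondary-market premium follows directly from the backing ratio. The ETH price and ETHE share price below are hypothetical round numbers for illustration only:

```python
def implied_nav(eth_per_share: float, eth_price_usd: float) -> float:
    """Net asset value per share: the ETH backing times the ETH spot price."""
    return eth_per_share * eth_price_usd

def premium(market_price_usd: float, nav_usd: float) -> float:
    """Fraction by which the share trades above (positive) or below (negative) its backing."""
    return market_price_usd / nav_usd - 1

# 6/19/2020 backing ratio, with a hypothetical $230 ETH price and $80 share price
nav = implied_nav(0.09391605, 230.0)
print(f"NAV per share: ${nav:.2f}")
print(f"premium: {premium(80.0, nav):.0%}")
```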
I keep hearing about how this is locked supply… explain?
As noted above, there is currently no redemption program for converting your ETHE back into ETH. This means that once an ETHE share is issued, it will remain in circulation until a redemption program is formed, which doesn’t seem to be too urgent for the SEC or Grayscale at the moment. Tiny amounts will naturally be removed due to fees, but the bulk of the asset is in there for good.
The Trust’s ETH will be transferred out of the ETH Account only in the following circumstances: (i) transferred to pay the Sponsor’s Fee or any Additional Trust Expenses, (ii) distributed in connection with the redemption of Baskets (subject to the Trust’s obtaining regulatory approval from the SEC to operate an ongoing redemption program and the consent of the Sponsor), (iii) sold on an as-needed basis to pay Additional Trust Expenses or (iv) sold on behalf of the Trust in the event the Trust terminates and liquidates its assets or as otherwise required by law or regulation.
Source: Description of Trust on page 31 of the “Grayscale Ethereum Trust Annual Report (2019)” – Located Here
Grayscale now owns a huge chunk of both ETH and BTC’s supply… should we be worried about manipulation, a sell-off to crash the market, or a staking cartel?
First, it’s important to remember Grayscale is a lot more akin to an exchange than, say, an investment firm. Grayscale is working on behalf of its investors to create this product for investor control. Grayscale doesn’t ‘control’ the ETH it holds any more than Coinbase ‘controls’ the ETH in its hot wallet. (Note: there are likely some varying levels of control, but specific to this topic, Grayscale cannot simply sell the ETH by its own decision [legally, at least], in the same manner that Coinbase wouldn’t be able to either.)
Yes, but what about [insert criminal act here]…
Alright, yes. Technically nothing is stopping Grayscale from selling all the ETH / BTC and running off to the Bahamas (Hawaii?). BUT there is no real reason for them to do so. Barry is an extremely public figure, and it wouldn’t be easy for him to get away with that. Grayscale’s Bitcoin Trust files SEC reports weekly / bi-weekly, and given the sentiment towards crypto, I’m sure it is being watched carefully. Plus, Grayscale is making tons of consistent revenue and thus has little to no incentive to give that up for a quick buck.
That’s a lot of ‘happy little feels’, Bob. Is there even an independent audit, or is this Tether 2.0?
Actually yes, an independent auditor report can be found in their annual reports. It is clearly aimed more towards the financial side, and I doubt the auditors are crypto savants, but it is at least one extra set of eyes. The auditors are Friedman LLP (auditor since 2015).
The company’s auditors, Friedman LLP, were also coincidentally Tether/Bitfinex’s auditors until they controversially parted ways in 2018, when the Tether controversy was at its height. I am not suggesting for one moment that there is anything shady about DCG; I just find it interesting that it’s the same auditor.
“Grayscale sounds kind of lame” / “Not your keys, not your crypto!” / “Why is anyone buying this? It sounds like a scam.”
Welp, for starters, this honestly is not really a product aimed at the people likely to be reading this post. To each their own, but remember that just because something provides no value to you doesn’t mean it can’t provide value to someone else. That said, some of the advertised benefits are as follows:
Why is there a premium? Why is ETHE’s premium so insanely high compared to GBTC’s premium?
There are a handful of theories about why a premium exists at all, some even mentioned in the annual report. The short list is as follows:
Are there any other differences between ETHE and GBTC?
I touched on a few of the smaller differences, but one of the more interesting changes is that GBTC became an “SEC reporting company” as of January 2020. This again goes beyond my scope of knowledge, so I won’t comment on it too much, but the net result is that GBTC now puts out weekly / bi-weekly 8-Ks and annual 10-Ks. This means you can track GBTC that much more easily, and IMO it adds an extra layer of validity to the product.
I’m looking for some statistics on ETHE… such as who is buying, how much is bought, etc.
There is a great Q1 2020 report I recommend you give a read; it has a lot of cool graphs and data on the product. It’s a little GBTC-centric, but there is some ETHE data as well. It can be found here, hidden within the 8-K filings. Q1 2020 is the 4/16/2020 8-K filing.
Is Grayscale only just for BTC and ETH?
No, there are other products as well. In terms of secondary-market products, ETCG is the Ethereum Classic version of ETHE. Fun fact: ETCG was actually put out to the secondary market first. It also has a 3% fee, of which 1% goes to some type of ETC development fund.
Are there alternatives to Grayscale?
I know they exist, but I don’t follow them. I’ll leave this as a “to be edited” section and will add as others comment on what they know.
Coinshares (formerly XBT Provider) is the only similar product I know of: BTC, ETH, XRP and LTC as Exchange Traded Notes (ETNs).
It looks like they are fully backed with the underlying crypto (no premium).
Denominated in SEK and EUR. Certainly available in some UK pensions (SIPP).
As asked by pegcity: Okay, so I was under the impression you can just give them your own ETH and get ETHE, but do you get 11 ETHE per ETH, or do you get the market value of ETH in USD worth of ETHE?
I have always understood that ETHE issued directly through Grayscale is issued without the premium. As in, if I were to trade 1 ETH for ETHE I would get 11, not only 2 or 3 because the secondary market premium is so high. And if I were paying cash, I would be paying the price of 1 ETH to get my 11 ETHE. Per page 39 of their annual statement, it reads as follows:
The Trust will issue Shares to Authorized Participants from time to time, but only in one or more Baskets (with a Basket being a block of 100 Shares). The Trust will not issue fractions of a Basket. The creation (and, should the Trust commence a redemption program, redemption) of Baskets will be made only in exchange for the delivery to the Trust, or the distribution by the Trust, of the number of whole and fractional ETH represented by each Basket being created (or, should the Trust commence a redemption program, redeemed), which is determined by dividing (x) the number of ETH owned by the Trust at 4:00 p.m., New York time, on the trade date of a creation or redemption order, after deducting the number of ETH representing the U.S. dollar value of accrued but unpaid fees and expenses of the Trust (converted using the ETH Index Price at such time, and carried to the eighth decimal place), by (y) the number of Shares outstanding at such time (with the quotient so obtained calculated to one one-hundred-millionth of one ETH (i.e., carried to the eighth decimal place)), and multiplying such quotient by 100 (the “Basket ETH Amount”). All questions as to the calculation of the Basket ETH Amount will be conclusively determined by the Sponsor and will be final and binding on all persons interested in the Trust. The Basket ETH Amount multiplied by the number of Baskets being created or redeemed is the “Total Basket ETH Amount.” The number of ETH represented by a Share will gradually decrease over time as the Trust’s ETH are used to pay the Trust’s expenses. Each Share represented approximately 0.0950 ETH and 0.0974 ETH as of December 31, 2019 and 2018, respectively.
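The basket formula in the excerpt above can be restated in code. The trust figures used here are made-up inputs purely to exercise the arithmetic:

```python
def basket_eth_amount(trust_eth: float, accrued_fees_usd: float,
                      eth_index_price: float, shares_outstanding: int) -> float:
    """Basket ETH Amount per the prospectus excerpt: subtract the ETH value of
    accrued but unpaid fees, divide by shares outstanding (carried to the
    eighth decimal place), then multiply by the 100 shares in a basket."""
    fee_eth = accrued_fees_usd / eth_index_price
    per_share = round((trust_eth - fee_eth) / shares_outstanding, 8)
    return per_share * 100

# Hypothetical trust state: 1,000,000 ETH held, $50,000 of accrued fees,
# a $250 ETH Index Price, and 10,500,000 shares outstanding
print(basket_eth_amount(1_000_000, 50_000, 250.0, 10_500_000))  # ETH per 100-share basket
```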
Streamr Network: Performance and Scalability Whitepaper
The Corea milestone of the Streamr Network went live in late 2019. Since then a few people in the team have been working on an academic whitepaper to describe its design principles, position it with respect to prior art, and prove certain properties it has. The paper is now ready, and it has been submitted to the IEEE Access journal for peer review. It is also now published on the new Papers section on the project website. In this blog, I’ll introduce the paper and explain its key results. All the figures presented in this post are from the paper.
The reasons for doing this research and writing this paper were simple: many prospective users of the Network, especially more serious ones such as enterprises, ask questions like ‘how does it scale?’, ‘why does it scale?’, ‘what is the latency in the network?’, and ‘how much bandwidth is consumed?’. While some answers could be provided before, the Network in its currently deployed form is still small-scale and can’t really show a track record of scalability for example, so there was clearly a need to produce some in-depth material about the structure of the Network and its performance at large, global scale. The paper answers these questions.
Another reason is that decentralized peer-to-peer networks have experienced a new renaissance due to the rise in blockchain networks. Peer-to-peer pub/sub networks were a hot research topic in the early 2000s, but not many real-world implementations were ever created. Today, most blockchain networks use methods from that era under the hood to disseminate block headers, transactions, and other events important for them to function. Other megatrends like IoT and social media are also creating demand for new kinds of scalable message transport layers.
The latency vs. bandwidth tradeoff
The current Streamr Network uses regular random graphs as stream topologies. ‘Regular’ here means that nodes connect to a fixed number of other nodes that publish or subscribe to the same stream, and ‘random’ means that those nodes are selected randomly.
Random connections can of course mean that absurd routes get formed occasionally, for example a data point might travel from Germany to France via the US. But random graphs have been studied extensively in the academic literature, and their properties are not nearly as bad as the above example sounds — such graphs are actually quite good! Data always takes multiple routes in the network, and only the fastest route counts. The less-than-optimal routes are there for redundancy, and redundancy is good, because it improves security and churn tolerance.
There is an important parameter called node degree, which is the fixed number of nodes to which each node in a topology connects. A higher node degree means more duplication and thus more bandwidth consumption for each node, but it also means that fast routes are more likely to form. It’s a tradeoff; better latency can be traded for worse bandwidth consumption. In the following section, we’ll go deeper into analyzing this relationship.
Network diameter scales logarithmically
One useful metric to estimate the behavior of latency is the network diameter, which is the number of hops on the shortest path between the most distant pair of nodes in the network (i.e. the “longest shortest path”). The below plot shows how the network diameter behaves depending on node degree and number of nodes.
We can see that the network diameter increases logarithmically (very slowly), and a higher node degree ‘flattens the curve’. This is a property of random regular graphs, and this is very good — growing from 10,000 nodes to 100,000 nodes only increases the diameter by a few hops! To analyse the effect of the node degree further, we can plot the maximum network diameter using various node degrees:
Network diameter in network of 100 000 nodes
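The logarithmic growth is easy to see with a back-of-envelope model: in a random d-regular graph each hop multiplies the reachable set by roughly d-1, so about log base (d-1) of n hops suffice to cover the network. This is a rough estimate, not the paper's exact analysis:

```python
import math

def diameter_estimate(n: int, d: int) -> float:
    """Rough hop-count estimate for a random d-regular graph of n nodes:
    each hop grows the reachable set by a factor of about (d - 1)."""
    return math.log(n) / math.log(d - 1)

for n in (10_000, 100_000, 1_000_000):
    row = {d: round(diameter_estimate(n, d), 1) for d in (4, 8, 16)}
    print(n, row)
```

Growing the network 10x adds only a couple of hops, and a higher degree flattens the curve, matching the plot's behavior.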
We can see that there are diminishing returns for increasing the node degree. On the other hand, the penalty (number of duplicates, i.e. bandwidth consumption), increases linearly with node degree:
Number of duplicates received by the non-publisher nodes
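The linear bandwidth penalty follows from the topology itself: a non-publisher node can eventually receive a copy of the message over each of its d links, but only the first copy is useful. A sketch of that upper bound:

```python
def duplicates_per_node(degree: int) -> int:
    """Upper bound on duplicate copies a non-publisher node receives:
    one copy per link, minus the single useful first copy."""
    return degree - 1

# Doubling the node degree roughly doubles the wasted bandwidth
for d in (4, 8, 16):
    print(f"degree {d}: up to {duplicates_per_node(d)} duplicates per message")
```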
In the Streamr Network, each stream forms its own separate overlay network and can even have a custom node degree. This allows the owner of the stream to configure their preferred latency/bandwidth balance (imagine such a slider control in the Streamr Core UI). However, finding a good default value is important. From this analysis, we can conclude that:
Latency scales logarithmically
To see if actual latency scales logarithmically in real-world conditions, we ran large numbers of nodes in 16 different Amazon AWS data centers around the world. We ran experiments with network sizes between 32 and 2048 nodes. Each node published messages to the network, and we measured how long it took for the other nodes to get the message. The experiment was repeated 10 times for each network size.
The below image displays one of the key results of the paper. It shows a CDF (cumulative distribution function) of the measured latencies across all experiments. The y-axis runs from 0 to 1, i.e. 0% to 100%.
CDF of message propagation delay
From this graph we can easily read things like: in a 32 nodes network (blue line), 50% of message deliveries happened within 150 ms globally, and all messages were delivered in around 250 ms. In the largest network of 2048 nodes (pink line), 99% of deliveries happened within 362 ms globally.
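Readings like “99% of deliveries within 362 ms” come straight off the empirical CDF. A minimal nearest-rank percentile over hypothetical latency samples (the numbers below are invented, not the paper's data):

```python
import math

def percentile(samples, q):
    """Nearest-rank percentile: smallest sample value such that at least a
    fraction q of all samples are at or below it."""
    s = sorted(samples)
    idx = max(0, math.ceil(q * len(s)) - 1)
    return s[idx]

latencies_ms = [80, 95, 110, 120, 135, 150, 160, 180, 210, 250]
print("median:", percentile(latencies_ms, 0.5))   # 135
print("p99:", percentile(latencies_ms, 0.99))     # 250
```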
To put these results in context, PubNub, a centralized message brokering service, promises to deliver messages within 250 ms — and that’s a centralized service! Decentralization comes with unquestionable benefits (no vendor lock-in, no trust required, network effects, etc.), but if such protocols are inferior in terms of performance or cost, they won’t get adopted. It’s pretty safe to say that the Streamr Network is on par with centralized services even when it comes to latency, which is usually the Achilles’ heel of P2P networks (think of how slow blockchains are!). And the Network will only get better with time.
Then we tackled the big question: does the latency behave logarithmically?
Mean message propagation delay in Amazon experiments
Above, the thick line is the average latency for each network size. From the graph, we can see that the latency grows logarithmically as the network size increases, which means excellent scalability.
The shaded area shows the difference between the best and worst average latencies in each repeat. Here we can see the element of chance at play; due to the randomness in which nodes become neighbours, some topologies are faster than others. Given enough repeats, some near-optimal topologies can be found. The difference between average topologies and the best topologies gives us a glimpse of how much room for optimisation there is, i.e. with a smarter-than-random topology construction, how much improvement is possible (while still staying in the realm of regular graphs)? Out of the observed topologies, the difference between the average and the best observed topology is between 5–13%, so not that much. Other subclasses of graphs, such as irregular graphs, trees, and so on, can of course unlock more room for improvement, but they are different beasts and come with their own disadvantages too.
It’s also worth asking: how much worse is the measured latency compared to the fastest possible latency, i.e. that of a direct connection? While having direct connections between a publisher and subscribers is definitely not scalable, secure, or often even feasible due to firewalls, NATs and such, it’s still worth asking what the latency penalty of peer-to-peer is.
Relative delay penalty in Amazon experiments
As you can see, this plot has the same shape as the previous one, but the y-axis is different. Here, we are showing the relative delay penalty (RDP). It’s the latency in the peer-to-peer network (shown in the previous plot), divided by the latency of a direct connection measured with the ping tool. So a direct connection equals an RDP value of 1, and the measured RDP in the peer-to-peer network is roughly between 2 and 3 in the observed topologies. It increases logarithmically with network size, just like absolute latency.
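The RDP metric is just a ratio, shown here with made-up latency figures in the 2-3x band the experiments observed:

```python
def relative_delay_penalty(p2p_latency_ms: float, direct_latency_ms: float) -> float:
    """RDP: overlay delivery latency divided by direct-connection (ping) latency.
    A value of 1.0 would mean the overlay adds no delay at all."""
    return p2p_latency_ms / direct_latency_ms

# e.g. 210 ms through the overlay versus a 75 ms direct ping
print(relative_delay_penalty(210, 75))  # 2.8
```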
Again, given that latency is the Achilles’ heel of decentralized systems, that’s not bad at all. It shows that such a network delivers acceptable performance for the vast majority of use cases, only excluding the most latency-sensitive ones, such as online gaming or arbitrage trading. For most other use cases, it doesn’t matter whether it takes 25 or 75 milliseconds to deliver a data point.
Latency is predictable
It’s useful for a messaging system to have consistent and predictable latency. Imagine for example a smart traffic system, where cars can alert each other about dangers on the road. It would be pretty bad if, even minutes after publishing it, some cars still haven’t received the warning. However, such delays easily occur in peer-to-peer networks. Everyone in the crypto space has seen first-hand how plenty of Bitcoin or Ethereum nodes lag even minutes behind the latest chain state.
So we wanted to see whether it would be possible to estimate the latencies in the peer-to-peer network if the topology and the latencies between connected pairs of nodes are known. We applied Dijkstra’s algorithm to compute estimates for average latencies from the input topology data, and compared the estimates to the actual measured average latencies:
Mean message propagation delay in Amazon experiments
We can see that, at least in these experiments, the estimates seemed to provide a lower bound for the actual values, and the average estimation error was 3.5%. The measured value is higher than the estimated one because the estimation only considers network delays, while in reality there is also a little bit of a processing delay at each node.
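The estimation procedure can be sketched with a standard Dijkstra implementation over a toy topology (the node names and link latencies below are invented):

```python
import heapq

def propagation_estimate(adj, source):
    """Estimate per-node message propagation delay with Dijkstra's algorithm,
    as in the paper's comparison. adj maps node -> [(neighbour, latency_ms), ...];
    returns the fastest-route delay from the source to every reachable node."""
    dist = {source: 0.0}
    heap = [(0.0, source)]
    while heap:
        d, u = heapq.heappop(heap)
        if d > dist.get(u, float("inf")):
            continue  # stale heap entry
        for v, w in adj.get(u, []):
            nd = d + w
            if nd < dist.get(v, float("inf")):
                dist[v] = nd
                heapq.heappush(heap, (nd, v))
    return dist

# A toy 4-node topology with measured link latencies in ms
topology = {
    "A": [("B", 20), ("C", 50)],
    "B": [("A", 20), ("C", 10), ("D", 60)],
    "C": [("A", 50), ("B", 10), ("D", 30)],
    "D": [("B", 60), ("C", 30)],
}
print(propagation_estimate(topology, "A"))
```

As in the paper, this only accounts for network delay; the small per-hop processing delay at each node explains why measured values run slightly above such estimates.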
Conclusion
The research has shown that the Streamr Network can be expected to deliver messages in roughly 150–350 milliseconds worldwide, even at a large scale with thousands of nodes subscribing to a stream. This is on par with centralized message brokers today, showing that the decentralized and peer-to-peer approach is a viable alternative for all but the most latency-sensitive applications.
It’s thrilling to think that by accepting a latency only 2–3 times longer than the latency of an unscalable and insecure direct connection, applications can interconnect over an open fabric with global scalability, no single point of failure, no vendor lock-in, and no need to trust anyone — all that becomes available out of the box.
In the real-time data space, there are plenty of other aspects to explore, which we didn’t cover in this paper. For example, we did not measure throughput characteristics of network topologies. Different streams are independent, so clearly there’s scalability in the number of streams, and heavy streams can be partitioned, allowing each stream to scale too. Throughput is mainly limited, therefore, by the hardware and network connection used by the network nodes involved in a topology. Measuring the maximum throughput would basically be measuring the hardware as well as the performance of our implemented code. While interesting, this is not a high priority research target at this point in time. And thanks to the redundancy in the network, individual slow nodes do not slow down the whole topology; the data will arrive via faster nodes instead.
Also out of scope for this paper is analysing the costs of running such a network, including the OPEX for publishers and node operators. This is a topic of ongoing research, which we’re currently doing as part of designing the token incentive mechanisms of the Streamr Network, due to be implemented in a later milestone.
I hope that this blog has provided some insight into the fascinating results the team uncovered during this research. For a more in-depth look at the context of this work, and more detail about the research, we invite you to read the full paper.
If you have an interest in network performance and scalability from a developer or enterprise perspective, we will be hosting a talk about this research in the coming weeks, so keep an eye out for more details on the Streamr social media channels. In the meantime, feedback and comments are welcome. Please add a comment to this Reddit thread or email [[email protected]](mailto:[email protected]).
Originally published by Henri at blog.streamr.network on August 24, 2020.
Getting a bank loan can sometimes seem like an impossible task. There is a credit history to check, collateral to provide, and other complex little details. Now, in times of the global financial crisis, banks will likely be even more cautious about giving out money.
This is why DeFi-based crypto loans are booming. The sector is still relatively new – it emerged around two and a half years ago. And since then, it has experienced rapid growth. Take a look at this graph demonstrating the cumulative lending to institutions by Genesis Lending Originations – one of the major OTC platforms.
By the end of the first quarter of 2020, the company gave out loans worth $1 billion. And that’s just Genesis Lending. Compound, the current leader among DeFi platforms currently has over $883 million in outstanding debt.
QDAO DeFi is launching a crypto loan feature as well. Users can deposit their crypto, borrow fiat, and take their crypto assets back after repaying the loan plus a small interest charge – simple as that. The service is available to everyone with internet access and provides outstanding benefits to users.
We will demonstrate use cases for three different types of borrowers: private persons, traders and institutional investors. Most likely you’ll recognize at least one of these situations.
Disclaimer: the interest rates stated below serve only as examples and are subject to change. Always check the current rates on the QDAO DeFi Loans page.
Example 1. The borrower is a private person
This is the most basic use case, in which the DeFi platform functions much like a pawnshop.
Let’s assume that a person has already bought or is planning to buy 1 Bitcoin and is committed to holding on to it. The new rally is coming soon and this person is getting ready to cash out. However, due to some circumstances, they need fiat money now, preferably as fast as possible.
This person could sell Bitcoin. But if they don’t buy it back soon, they might lose the opportunity for profit.
Now, there is no need to choose between taking care of urgent issues and keeping the asset. Anyone can simply use crypto as collateral.
Let’s say that at the moment, Bitcoin costs $9,000. If the borrower puts it in QDAO DeFi, they can get up to 50% of the cryptocurrency’s value in fiat, i.e. $4,500. There is no credit check or extensive paperwork involved, and all transactions are practically instant.
The borrower uses the cash however they see fit. Meanwhile, their collateral remains untouched in Custody Storage. Suppose the monthly interest rate is 2%. If this person wants to get their Bitcoin back in three months, they will only need to pay $4,770. By that time, the price of their Bitcoin might have risen well above $10,000, so the deal is quite lucrative.
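The arithmetic behind this example can be sketched in a few lines. This is purely illustrative: the 50% loan-to-value cap and 2% simple monthly interest are the example figures from this article, not QDAO DeFi's actual published terms, which (as the disclaimer above notes) are subject to change.

```python
def loan_terms(collateral_value, ltv=0.50, monthly_rate=0.02, months=3):
    """Sketch of a collateral-backed loan with simple monthly interest.

    ltv and monthly_rate mirror the article's example (50% LTV, 2%/month);
    real platform parameters may differ.
    """
    principal = collateral_value * ltv          # fiat the borrower receives
    repayment = principal * (1 + monthly_rate * months)  # owed at payoff
    return principal, repayment

# 1 BTC at $9,000, repaid after three months:
principal, repayment = loan_terms(9_000)
print(principal, round(repayment, 2))  # 4500.0 4770.0
```

The same function covers the trader and institutional examples below by swapping in their collateral values and loan durations.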
Example 2. The borrower is a crypto-trader
Let’s look at a serious crypto-trader who has 10 Bitcoins. The market is leaving the consolidation phase and is preparing for a rally.
They are serious about committing to a ‘HODL’ strategy and want to keep their Bitcoins. At the same time, this trader continues to make deals on exchanges to enhance future profits. After studying Ripple, they decide that it would make a nice addition to their portfolio and want to buy it. But this trader doesn’t have disposable funds to acquire enough XRP.
Since the Bitcoins are just sitting in a wallet, these assets can be used to acquire fiat resources. If the trader deposits all their Bitcoins (worth $90,000), they will get a significant boost of up to $45,000. And the best part is that the loan is in USDT, so it’s possible to just swap Tether for Ripple without extra transactions.
Let’s assume the trader sells the XRP in four months. They use part of the profit to cover the assumed interest of 2% per month ($3,600), keep the rest for themselves, and retrieve their Bitcoins.
Example 3. The borrower is an institutional investor
There is a crypto fund or an asset management company with high net worth, long-term clients. This agency has a sizable portfolio that includes major stocks like Amazon and Apple as well as crypto assets like Bitcoin, Ripple and Ethereum.
This company manages 10,000 Bitcoins – a combined asset worth of $90,000,000. According to the latest market analysis, BTC will stand at about $20,000 in 2021 and at about $100,000 by 2025. So obviously the agency should hold on to its Bitcoin, and the clients agree.
However, there is a very big IPO coming soon. The company has a unique opportunity to buy very promising stock. In several years, this investment will result in a very profitable exit.
But the current budget limits the volume of the possible investment. The agency is going to get more funds in two months, but the opportunity will be gone by then.
Since the company is going to be holding Bitcoin for some time anyway, it decides to use it as collateral. By depositing the assets in QDAO DeFi, it can receive a loan of up to $45,000,000 and use the money to buy shares. In two months, the company pays the assumed interest ($1,800,000) and reclaims the Bitcoin.
The clients and the management are happy, because now the agency has even more great assets in the portfolio.
Example 4. The borrower does not repay the loan
Someone follows the usual process – puts 3 Bitcoins into QDAO DeFi and takes the $13,500 fiat loan.
For whatever reason, this person does not repay the debt for 7 months while the interest keeps growing. Then a sudden volatility spike sends the Bitcoin price down to $5,000 for a time, hitting the margin call mark. The collateral is now worth less than the loan plus accrued interest, so the borrower has no reason to repay the debt, since that would mean a complete loss for them. There is also no reason for QDAO DeFi to keep storing this person’s Bitcoin.
The borrower is notified that the assets are now the property of QDAO DeFi and the loan is closed. No further legal action is taken, and no penalties are incurred for defaulting on the loan.
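The liquidation logic in this example can be sketched as a simple check. Note the trigger rule here (liquidate once the collateral's market value no longer covers principal plus accrued simple interest) is an assumption for illustration; the article does not specify QDAO DeFi's exact margin-call formula. The dollar figures come from Example 4.

```python
def should_liquidate(btc_amount, btc_price, principal, monthly_rate, months_elapsed):
    """Hypothetical margin-call rule: liquidate when the collateral's
    market value drops below principal plus accrued simple interest.
    (Illustrative only; not the platform's documented formula.)
    """
    debt = principal * (1 + monthly_rate * months_elapsed)  # loan + interest owed
    collateral_value = btc_amount * btc_price               # current market value
    return collateral_value < debt

# Example 4: 3 BTC backing a $13,500 loan at 2%/month, 7 months in,
# with BTC dipping to $5,000 — the $15,000 collateral no longer covers
# the ~$15,390 owed, so the position is liquidated:
print(should_liquidate(3, 5_000, 13_500, 0.02, 7))  # True
```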
The ability to choose almost all the terms of a cash loan with QDAO DeFi provides unlimited opportunities to get money for any purpose. By planning ahead, all borrowers can get additional fiat assets to multiply their profits while still retaining everything they had before taking out the loan.
Want to be the first to hear QDAO DeFi news and updates? Visit our website and stay in touch with us on social media: Twitter, Facebook, Telegram and LINE (for the Japanese-speaking community).
- MIP1 (Maker Governance Paradigms)
- MIP2 (Launch Period)
- MIP3 (Governance Cycle)
- MIP4 (MIP Amendment and Removal Process)
- MIP5 (Emergency Voting System)
- MIP6 (Collateral Onboarding Form/Forum Template)
- MIP7 (Onboarding and Offboarding Domain Teams for Collateral Onboarding)
- MIP8 (Domain Greenlight)
- MIP9 (Community Greenlight)
- MIP10 (Oracle Management)
- MIP11 (Collateral Onboarding General Risk Model Management)
- MIP12 (Collateral and Risk Parameter Management)

By and large, the MIPs codify many of the informal Maker governance processes. There is currently a request-for-comments period (MIP forum), and an informal poll will be held on Monday, April 27 on whether to proceed with the 13 MIPs and 2 subproposals. If the result is a “Yes”, an executive vote for official ratification would start on May 1 and last for 4 days. If it passes, the official governance cycle will begin and the rest of the MIPs will likely be approved from May 4 – 6.
Ren Is The Second Most Popular Method For Bitcoin Tokenization
Shortly after the news that $449M worth of Bitcoin is held on the Ethereum blockchain, one of the inter-blockchain liquidity leaders, RenBTC, marked another milestone. According to the data, RenBTC, the second most popular method of Bitcoin tokenization, reached 10,000 BTC locked on Ethereum's network (see graph below).
The data is from August 17, but at the time of writing, statistics show a drop, with the current supply of locked tokens on the ETH network at 8,329 BTC. The total current market value is $98,270,860.
However, the gap between RenBTC and the current leader in tokenizing Bitcoin for use with DeFi protocols, Wrapped Bitcoin (wBTC), is still very significant. Looking at current data, wBTC stands at 29,063 BTC on the Ethereum network, with a market value of $342,912,825 as of today.
In just one week, RenVM’s total value locked (TVL) more than doubled to reach $175 million, according to data from DeFi Pulse. RenVM’s native token, REN, also saw a massive boost in its price, gaining 245% since the start of August. REN started the month at $0.16065 and reached an all-time high of $0.566408 just 18 days later, with exponential growth in trading volumes. Currently, it is the 35th-largest cryptocurrency according to CoinMarketCap, trading at $0.529359.
Source: Defi Pulse
The recent REN price surge indicates that more investors are researching options to put their BTC holdings into more profitable projects. As Cryptobrowser.io reported this Monday, Bitcoin tokenization platforms combined were responsible for over 38,000 BTC locked on Ethereum’s blockchain. The total amount as of today is 43,354 BTC, valued at over $513 million.
The exponential increase in Bitcoin tokenization has created a rivalry between Bitcoin maximalists and Ethereum supporters over the total amount of tokenized BTC on Ethereum’s network. Blockstream’s CSO Samson Mow sparked a flame on a podcast hosted by Peter McCormack. In the episode, which featured Ethereum co-founder Vitalik Buterin, Mow stated that tokenized Bitcoin only proves that the Ethereum ecosystem needs Bitcoin.
“The whole reason that Bitcoin is staying wrapped to be used on Ethereum is that it is stable and reliable. That's why it seems to be Ethereans prefer using wrapped Bitcoin to Ether to do their DeFi stuff,” Mow added.
Mow’s stance was met with a wave of negative reactions, with YouTube channels like Bankless calling his statement a “blatant lie”. In a response video, the hosts of Bankless noted that Bitcoin struggles with the same scaling and transaction-fee problems as Ethereum, and that the only way for Bitcoin to tackle its scaling issue is to be tokenized on Ethereum’s blockchain. Many Ethereum fans also criticized Samson Mow and reportedly turned off the podcast.
When Chris Giancarlo was the chairman of the Commodity Futures Trading Commission he became a rock-star of sorts in certain corners of the cryptocurrency community, helping establish criteria that eventually led to bitcoin and ethereum being declared commodities, more like coffee or sugar than stock in a company. The U.S. Securities and Exchange Commission largely followed suit, eventually also declaring that bitcoin and ether, the cryptocurrency powering the ethereum blockchain, weren’t securities.
Now chairman emeritus Giancarlo, who was deemed “Crypto Dad” following an impassioned speech he gave to Congress in which he credited bitcoin for finally getting his kids interested in finance, is at it again, having co-written a detailed argument, published this morning in the International Financial Law Review, for why XRP, the cryptocurrency formerly known as “ripples,” is also not a security. The only problem is that he’s no longer a regulator. In fact, his employer is on the payroll of Ripple, the largest single owner of XRP, whose co-founders actually created the cryptocurrency.
The bombshell paper, titled, “Cryptocurrencies and U.S. Securities Laws: Beyond Bitcoin and Ether,” co-authored by commodities lawyer Conrad Bahlke of New York law firm Willkie Farr & Gallagher LLP, methodically reviews the criteria of the Howey Test, established by the SEC in 1946 to determine whether something is a security, and point-by-point argues that XRP does not qualify. Rather, the paper argues, like its name would indicate, cryptocurrency is a currency of perhaps more interest to the Federal Reserve and central banks than securities regulators.
What’s at stake here for the cryptocurrency world cannot be overstated. XRP is now the fourth largest cryptocurrency by market cap, with $5.9 billion worth of the asset in circulation according to cryptocurrency data site Messari. While Ripple was valued at $10 billion according to its most recent round of funding, the company continues to fund itself in part by selling its deep war chest of 55.6 billion XRP, coincidentally valued at the same amount as the company itself.
Not only could an eventual decision by the SEC to classify—or not classify—XRP as a security impact the untold individual owners of the cryptocurrency, but other clients using Ripple services that don’t rely on the cryptocurrency, including American Express, Santander, and SBI Holdings, could stand to be impacted positively or negatively depending on the decision. After all, if XRP were deemed a security, it would be a huge cost to their software provider. If Giancarlo is right, though, Ripple could end up being one of the most valuable startups in fintech.
“Ultimately, under a fair application of the Howey test and the SEC’s presently expanding analysis, XRP should not be regulated as a security, but instead considered a currency or a medium of exchange,” Giancarlo and Bahlke argue in the paper. “The increased adoption of XRP as a medium of exchange and a form of payment in recent years, both by consumers and in the business-to-business setting, further underscores the utility of XRP as a bona fide fiat substitute.”
Giancarlo was nominated to be a commissioner of the CFTC by then-President Barack Obama in 2013. In 2015, he helped lead the thinking behind the CFTC’s decision that bitcoin and other cryptocurrencies were commodities, paving the way for the SEC’s related comments that neither bitcoin nor ethereum is a security. Then, at the height of the 2017 cryptocurrency bubble, President Trump nominated him to be chairman of the CFTC, where he oversaw the creation of a number of bitcoin futures projects, including at CME Group and the short-lived effort at Cboe.
While many blame the creation of bitcoin futures for popping the 2017 price bubble, in which bitcoin almost hit $20,000 before falling to about half that value today, others have seen the move as part of a fundamental maturing process, helping pave the way for more sophisticated crypto-enabled financial offerings. Giancarlo’s last day in office at the CFTC was in 2019, after which he promptly got involved in helping envision the future of assets issued on a blockchain. In November he joined American Financial Exchange as an advisor, using ethereum to create a Libor alternative. The following January he co-founded the Digital Dollar Project, leading the push to use blockchain at the Federal Reserve, and now it would seem he’s hoping to influence the classification of XRP as he did for bitcoin and ethereum, but from the other side of regulation.
Importantly however, a footnote in the report discloses that not only is Giancarlo and Bahlke’s firm, Willkie Farr & Gallagher LLP counsel to Ripple Labs, but they “relied on certain factual information provided by Ripple in the preparation of this article.” While it’s impossible to parse what information came from the co-authors and what came from Ripple, the resulting legal argument is fascinating, even if it does leave room for doubt.
The Howey test Giancarlo uses to bolster his arguments is a three-pronged definition used by the SEC, none of which he says apply to XRP. The first prong is that an investment contract should be implied or explicitly stated between the issuer of the asset (in this case XRP) and the owner, in which money changes hands. “The mere fact that an individual holds XRP does not create any relationship, rights or privileges with respect to Ripple any more than owning Ether would create a contract with the Ethereum Foundation, the organization that oversees the Ethereum architecture,” he writes.
This does however overlook the fact that OpenCoin, credited on Ripple’s own site in 2013 for creating XRP (then tellingly described as “ripples”), was run by many of the same people that founded Ripple. The original creators of XRP then donated the vast majority of the assets to Ripple, which they also ran, creating a sense of distance, tacit though it may be. The actual data around the creation of XRP was also muddled by a glitch in the code that means unlike bitcoin and ethereum the crucial genesis data is no longer attached to the rest of the ledger. The rebranding of “ripples” as XRP further extended the sense of distance between XRP and Ripple, followed by an aggressive campaign to get media to stop describing the cryptocurrency as “Ripple’s XRP.”
With so much distance between the company that actually created XRP and the company that now owns more than half of it, one would be forgiven for wondering, if there was an implied contract between OpenCoin and XRP owners, does the donation from one group of people at one company to a very similar group of people at another company sever that responsibility? In spite of the sense of distance created by Ripple between itself and the cryptocurrency its co-founders created, a number of active lawsuits alleging securities violations have been filed. In all fairness though, Giancarlo appears to recognize this prong may not be Ripple’s strongest defense and concludes the section, hedging: “Even if XRP were to satisfy one or two of the “prongs” of the Howey test, it does not satisfy all three factors such that XRP is an investment contract subject to regulation as a security.”
The second prong of the Howey test stipulates that there can be no “common enterprise” between shareholders or between a shareholder and the company. While refuting both relationships, Giancarlo curiously goes on to write that “given the juxtaposition between XRP’s intended use as a liquidity tool, its more general use to transfer value and its potential as a speculative asset, XRP holders who utilize the coins for different purposes have divergent interests with respect to XRP.”
Ironically, there has always been a widely held belief that owning a cryptocurrency would unify interests around a single goal: to co-create the infrastructure that lets the cryptocurrency exist and ensure it was vibrant and diverse. Meanwhile, XRP, in spite of its aggressive supporters on social media, is one of the least diverse ecosystems, with the vast majority of serious development being done within Ripple. If XRP owners aren’t expecting an increase in value from the work being done by Ripple, they certainly aren’t nearly as involved in helping build that future as are owners of bitcoin and ethereum.
In a related issue, the third prong of the Howey test stipulates that “no reasonable expectation of profit should be derived from the efforts of Ripple,” according to the paper. Supporting this position, Giancarlo writes: “Though Ripple maintains a sizable stake of the XRP supply and certainly has a pecuniary interest in the value of its holdings, it is not enough to suggest that a mutual interest in the value of an asset gives rise to an expectation of profits as contemplated by Howey.” Again, this strains credulity.
According to its own site, Ripple currently has access to 6.4% of all the XRP ever created. But that doesn’t count the 49.2% of the total XRP Ripple owns that is locked in a series of escrow accounts which become periodically available to Ripple and Ripple alone. Adding those two percentages together leaves a float of only about 44% of XRP that has been distributed for public ownership. For some comparison, Facebook went public the same year XRP was created and has a 99% float, according to FactSet data, meaning almost all of its stock is in the hands of traders. While Ripple does also have more traditional stock, this distribution shows that Ripple might not be as distributed as it claims.
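The float arithmetic above is easy to verify. The percentages below are the ones reported in this article; the calculation is just a sanity check of the "about 44%" figure, not FactSet's methodology.

```python
# Ripple's reported XRP position (figures as stated in the article):
direct_holdings_pct = 6.4   # XRP Ripple can access directly
escrowed_pct = 49.2         # XRP locked in Ripple-controlled escrow

# Whatever Ripple doesn't control is the public float:
public_float_pct = 100 - direct_holdings_pct - escrowed_pct
print(round(public_float_pct, 1))  # 44.4 — the "about 44%" the article cites
```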
While it’s perhaps no surprise that Giancarlo would come out on the side of his own client, there’s also plenty of other reasons to believe his argument may in fact hold water. In February 2018, the notoriously compliant exchange Coinbase added support for XRP, something it would unlikely do if it were concerned it might accidentally be selling an unlicensed security. Perhaps most tellingly though, Ripple has also been granted a difficult-to-obtain BitLicense from the New York Department of Financial Services, giving it the blessing of a respected regulator. However, while the license was granted after then-superintendent Benjamin Lawsky stepped down from the regulator, it's perhaps no coincidence that a year later he joined Ripple on its board of directors and is now active in the cryptocurrency space. Perhaps a similar fate is in store for Giancarlo.
Editor’s note: This article has been updated to clarify that Ripple Labs is a client of Giancarlo’s law firm.