
An attempt at explaining DeFi (this week...)

Warning, long post from my morning's contemplation. See https://twitter.com/markjeffrey/status/1300175793352445952 (Mark Jeffrey, 30 mins) for a video explaining DeFi.
This is my attempt at explaining DeFi.
I’m still learning this stuff, so any corrections are welcomed.
Links are provided for information, none are recommendations, nor referral links. Do your own research (DYOR) before investing :)
I’ll try not to shill YFI too much...
Not all platforms use the same mechanics as I describe, but I think I’ve covered the most common ones.

Stable coins
Cryptocurrency that is intended to maintain a level value, normally with respect to USD. Some rely on a trusted third party who holds actual USD in a bank account (USDT aka Tether, USDC…); others are trustless (DAI).

Maker
Lock collateral into the smart contract. DAI can then be generated and used for other things. DAI is designed to track the USD and is completely trustless. You must keep more value staked than the DAI withdrawn (at least 150% collateralisation) or you will get liquidated.
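The 150% rule can be sketched in a few lines of Python (illustrative only; Maker's actual contracts track a per-asset liquidation ratio):

```python
def max_dai(collateral_value_usd, min_collateral_ratio=1.5):
    """Most DAI that can be generated against locked collateral."""
    return collateral_value_usd / min_collateral_ratio

# $600 of ETH locked at a 150% minimum ratio allows at most 400 DAI.
max_dai(600)  # 400.0
```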

BTC on ETH
Bitcoin cannot be directly used on the Ethereum chain, so there are a number of ways to make its value available. Most involve trusting a third party; the most common is Wrapped BTC (WBTC).
Note: WETH (Wrapped ETH) is used by some contracts in place of ETH (direct use of ETH is not possible in some contracts). Unlike WBTC, WETH is trustless as everything is done on the Ethereum blockchain (I think).

Lending
You deposit a valuable token into a pool on a platform, someone else borrows it. They pay interest to the pool, and you get a proportion of the pool's interest over time. When there is high demand for a particular token, the interest rate increases dynamically.
e.g. look at the interest rate model (click on the figure) for
https://compound.finance/markets/USDC
Borrow rates increase linearly as more of the available pool is loaned out: 2% at zero utilisation and 12.5% when the pool is emptied.
Earnings are lower than the borrowing rates because there is more in the pool than is borrowed, and the platform takes a cut.
e.g. when 50% of the pool is borrowed, borrowers pay 7.25% but lenders only get 3.38%. 3.38 / 0.5 = 6.76%, so about 0.5% of the interest is being taken by Compound.
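The USDC numbers above fit a simple linear utilisation model. A hedged sketch (parameters inferred from the figures in this post, not Compound's actual contract values):

```python
def borrow_rate(utilisation, base=0.02, slope=0.105):
    # 2% APR at zero utilisation, 12.5% when the whole pool is borrowed.
    return base + slope * utilisation

def supply_rate(utilisation, reserve_factor=0.068):
    # Lenders earn the interest paid, scaled by utilisation,
    # minus the platform's cut (the reserve factor).
    return borrow_rate(utilisation) * utilisation * (1 - reserve_factor)

borrow_rate(0.5)   # 0.0725 -> the 7.25% borrow APR from the example
supply_rate(0.5)   # ~0.0338 -> the 3.38% lender APR from the example
```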
Different pools have different interest rate functions, DAI has an inflection point to maintain a buffer https://compound.finance/markets/DAI
The interest rate increases slowly to 4% until 75% of the available pool is loaned out. Beyond that it is much more expensive to borrow, e.g. 16% APR at 90% utilisation.
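That "inflection point" is a kink in the rate curve. A sketch with slopes roughly fitted to the two data points quoted above (not DAI's actual parameters):

```python
def kinked_borrow_rate(utilisation, kink=0.75):
    # Gentle slope up to 75% utilisation (reaching 4% APR),
    # then a steep slope that hits ~16% APR at 90% utilisation.
    if utilisation <= kink:
        return 0.04 * (utilisation / kink)
    return 0.04 + 0.80 * (utilisation - kink)

kinked_borrow_rate(0.75)  # 0.04
kinked_borrow_rate(0.90)  # 0.16
```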
When lending a single token into a single pool, you should always get back slightly more of the same token.

How lending works
You deposit ETH, and you are given a token back as proof of participation in the pool (cETH for compound.finance).
The exchange rate for cETH to ETH is NOT fixed; rather, it changes over time. As ETH interest is paid into the pool, the cETH becomes more valuable compared to the initial deposit.
e.g. you deposit 10 ETH and get 499.52 cETH. A month later, you redeem the 499.52 cETH and get 10.1 ETH back. You have just gained 1%.
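The cETH mechanics reduce to a floating exchange rate. A toy sketch using the numbers above:

```python
def to_ctoken(amount, exchange_rate):
    # Deposit: underlying / rate -> pool share tokens.
    return amount / exchange_rate

def from_ctoken(ctokens, exchange_rate):
    # Redeem: pool share tokens * (now higher) rate -> underlying.
    return ctokens * exchange_rate

rate_t0 = 10.0 / 499.52          # rate at deposit time
ceth = to_ctoken(10.0, rate_t0)  # 499.52 cETH
rate_t1 = rate_t0 * 1.01         # interest accrued: rate drifted up 1%
from_ctoken(ceth, rate_t1)       # 10.1 ETH back
```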

Taxes
In many jurisdictions, converting ETH to cETH would be classed as a taxable event (DYOR ! )

Lego Bricks
The cETH represents your ETH, so it has value. This means it can be used for other things...
Lego bricks is taken to mean that all these things fit together and you can use them in different ways.

How borrowing works
You need to be overcollateralised to borrow from most platforms. So, if you deposit 10.0 ETH into a smart contract, you (currently) have $4,000 of collateral to work with. The platform may then let you borrow a percentage of your collateral in other tokens.
So, you can borrow $2,000 of USDC to buy 5 more ETH. Then, when the ETH price goes up, you sell ETH for $2,100 of USDC and repay the loan plus interest. Now you have 10.x ETH.
This is a form of leverage: when the price goes up, you win. However, if the ETH price goes down, you risk being liquidated. This means part of your collateral will be sold at the (lower) market price to repay your loan, and there will likely be a penalty for you. (e.g. at ETH = $300, 7.33 of your ETH is sold for ~$2,200 to repay your $2,000 USDC loan plus a penalty, and you keep the remaining 2.67 ETH plus the 5 ETH you purchased.)
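A minimal health check for such a position, assuming a 150% maintenance requirement (platforms differ):

```python
def is_liquidatable(collateral_eth, eth_price_usd, debt_usd, min_ratio=1.5):
    # Position becomes unsafe once collateral value drops below debt * ratio.
    return collateral_eth * eth_price_usd < debt_usd * min_ratio

is_liquidatable(10.0, 400.0, 2000.0)  # False: $4,000 vs the $3,000 floor
is_liquidatable(10.0, 290.0, 2000.0)  # True: $2,900 is below the floor
```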

Shorting
Deposit $8,000 of collateral, borrow 10 ETH and sell it at $400 each. If the price drops to $380, buy 10.1 ETH to repay the loan plus interest. You have just made $162 profit. However, if the price goes up, you will still need to buy 10.1 ETH.
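The short's arithmetic, with the interest assumed to be the extra 0.1 ETH from the example:

```python
def short_profit(eth_borrowed, sell_price, buy_price, interest_eth=0.1):
    proceeds = eth_borrowed * sell_price                     # sell borrowed ETH
    repay_cost = (eth_borrowed + interest_eth) * buy_price   # buy back + interest
    return proceeds - repay_cost

short_profit(10, 400, 380)  # 162.0, the profit from the example
short_profit(10, 400, 420)  # -242.0: the loss if the price rises instead
```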

Flash Loans
A technomage creates a single transaction that borrows a lot of money, then, within the same single ~13 second block, uses it to do lots of complex things to (hopefully) make a profit. As it's all within a single block, collateral is not required.
See https://mobile.twitter.com/nanexcool/status/1297068546023993349 for a transaction that made ~46,000 USDC profit (without collateral)
If this post is introducing you to the possibilities of flash loans, you are very unlikely to ever do one in the near future.
I think Aave is the most common source for flash loans.

Simple lending farming
Simply put your tokens in whichever platform offers the highest interest rate. Moving to the best option costs gas (and attention).

Complex lending farming
Some platforms offer tokens in return for using the platform, so simple APR comparisons aren't sufficient. If the additional platform token has high value, it can distort the market.
E.g. when COMP was initially offered, it was profitable to:

  1. Place collateral on compound.finance
  2. Borrow BAT at 30%
  3. Lend the BAT back to the same platform at 15%
  4. Collect the COMP accrued due to interest paid and interest earned.
  5. Sell the COMP on the open market.
This technique was made less favourable when Compound changed the distribution model so smaller pools (like BAT) couldn't be exploited in this way.
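Why steps 1-5 could pay despite borrowing at a higher rate than lending: the COMP rewards on both legs outweighed the interest spread. A sketch with purely illustrative reward numbers (the actual COMP APRs varied day to day):

```python
def net_apr(borrow_apr, lend_apr, reward_apr_borrow, reward_apr_lend):
    interest_spread = borrow_apr - lend_apr         # cost of the loop
    rewards = reward_apr_borrow + reward_apr_lend   # value of COMP earned
    return rewards - interest_spread

# Borrow BAT at 30%, re-lend at 15%; profitable if COMP rewards > 15%.
net_apr(0.30, 0.15, 0.12, 0.08)  # ~0.05 -> about +5% APR overall
```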

DEX
Decentralised exchanges range from ones where you deposit assets, trade with an order book and then withdraw, to simple interfaces that allow you to swap tokens. Of the latter, the most popular is Uniswap.

Liquidity provision
The swap-based DEXs rely on liquidity providers (LPs). Here you deposit equal values of two tokens, e.g. USDC and ETH.
Then any time someone wants to swap USDC for ETH on the exchange, they add USDC and remove ETH from the pool.
Each time someone does a swap, they pay a fee to the liquidity pool and you get a share.
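Swap-style DEXs like Uniswap use a constant-product pool (x * y = k). A minimal sketch of one swap, with the 0.3% fee left in the pool for the LPs:

```python
def swap_usdc_for_eth(pool_usdc, pool_eth, usdc_in, fee=0.003):
    usdc_after_fee = usdc_in * (1 - fee)   # the fee stays with the pool
    k = pool_usdc * pool_eth               # invariant before the swap
    new_pool_eth = k / (pool_usdc + usdc_after_fee)
    return pool_eth - new_pool_eth         # ETH paid out to the swapper

# 400,000 USDC / 1,000 ETH pool: swapping 4,000 USDC moves the price,
# so you receive slightly less than 10 ETH.
swap_usdc_for_eth(400_000, 1_000, 4_000)
```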

Impermanent loss
However, if the price of one asset goes up, the pool will rebalance to hold less of it. So you see an overall increase, but not as much as if you had just held.
See https://twitter.com/ChainLinkGod/status/1270046868932661248 for an example.
Hopefully, the fees accrued are greater than the losses.
https://twitter.com/Tetranode/status/1300326676451057664/photo/1
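For a constant-product 50/50 pool, the loss versus simply holding depends only on the relative price change r (a standard result, not specific to any one DEX):

```python
import math

def impermanent_loss(r):
    # LP value relative to holding, minus 1, for price change factor r:
    # 2*sqrt(r)/(1+r) - 1. Zero when r == 1, negative otherwise.
    return 2 * math.sqrt(r) / (1 + r) - 1

impermanent_loss(1.0)  # 0.0: no price change, no loss
impermanent_loss(2.0)  # ~ -0.057: ~5.7% behind holding after a 2x move
```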

Stable coin pairs
If you restrict yourself to similar assets (e.g. USD stable coins, or different versions of BTC on Ethereum), then the impermanent loss is much reduced. Curve.finance focuses on such like-for-like pools and allows multiple tokens in a single pool.

Complex farming liquidity pools
Taking advantage of governance token rewards for using certain exchanges / pools. This can be done to bootstrap liquidity and / or decentralise the governance of the DEX.
The tokens received have value because of expected future income, or governance rights (which may be exploited for future income).

Yearn
Yearn is a group of smart farming protocols that allow pooling to reduce gas costs and benefit from smart developers / contracts.
The simplest, Earn, takes tokens / stable coins and places them in the highest-yielding lending platform for that token. https://yearn.finance/earn
The yCRV vault provides USD stable coin liquidity within curve for trading fees, but also lending fees via Yearn pools for each stable coin (oh and it gets CRV governance tokens…).
Other vaults use more complex strategies. The collateral is used to generate stable coins that then generate income from interest rates, Liquidity provision fees, and accrual of governance tokens. Some governance tokens are sold, others are used to optimise the rewards from other platforms.
For example, see this video on the Link Vault (Mark Jeffrey 13 mins).
https://twitter.com/markjeffrey/status/1300175793352445952
I expect the ETH vault may be similar, but may include Maker to generate the stable coins (rather than borrowing on Aave).
This video is a good intro on curve / yearn products (DeFIDad 31 mins) https://www.youtube.com/watch?v=yP-4pJpKbRU
All of these steps can be done by yourself; however, gas costs would be significant unless you have a large amount invested. Earn and the vaults pay fees to the YFI protocol.

YFI
YFI is the token for Yearn. Only 30,000 will ever be issued, so you can no longer farm them; instead you can:
1) Stake them for governance rewards
2) Place them in a yYFI vault to gain more YFI
3) Use them as long-term venture capital funds within a DAO (coming soon (tm)).

YFII, YVFV etc.
Forks of YFI with different tokens / fees.

YAM, Sushi, YFII, etc.
To be completed…

Synthetix
To be completed...

Finally:
This is not financial advice.
There are multiple risks which get larger as more moving parts are added.
Errors and omissions expected.
Do your own research.
Comments and corrections welcomed
submitted by Over-analyser to ethfinance [link] [comments]

FI on my birthday!

I did it today. On my 47th birthday, I reached $3 million in cash & investments, a paid-off house & 2 cars. I have reached my FI target of $100k/year @ 3.3% SWR. I knew it was going to be close with the markets, payday, and my company's equity coming in today.
Plan is to FIRE in 3 months. $3 million was a symbolic number, I could have FIREd 2 months ago at $2.95 million and lived pretty much the same life. However, I am getting another ~$70k in equity in 3 months and would like a bit of a buffer especially with the volatile markets. Also, the plan was to take a nice trip to Europe in August - I don't see that happening.
It is crazy, I know of many people who are laid off, working reduced hours, worried about their job or tapping into debt. And I am making plans to quit working.

Mega edit:
Asset allocation
Cash & short term investments: 25% (increased 10% from equities due to C19)
Employer's Equity: 10%
Equity ETFs: 45% (down 10% - sold in early march, will buy back in later)
Bond ETFs: 10%
Crypto: 10%
Please bring on the flames for timing the market, but I sold early-ish, it helps me sleep at night, and right now I am trying to be more conservative vs. aggressive.
The crypto is a flyer. I bought casino level bitcoin in 2012 at $18, then sold a bunch when it went up to $150. Then bought a bunch more at $1000, and have been selling little bits for a few years. Total investment: $45k, total value sold and still held: $500k. I would like to sell more, but it has a capital gains tax liability, so FIRE with no income next year would help reduce taxes.

About my journey
Grew up middle class. Money was tight from time to time, but I never really saw that.
Part time job when I was 16 for spending cash. Never went into debt. Saved a little.
Went into a good college for STEM, received $8000 total for tuition from family and received a student loan for $1000. Did a few paid internships while in college. Paid off the loan with my first post-school paycheck.
Graduated into the tech industry in '97. Started full time at the last internship: $40k base.
- Stayed there 2 more years, increased base by 20%: $48k base.
- Left (a co-worker left and pulled me along) for a ~100% increase: $90k base.
- Stayed there 6 months (dying ship), left for a ~10% increase: $100k base.
- Stayed there 2 years (all of my managers up to the CEO left within 2 weeks), left for a 0% change: $100k base.
- Stayed there 1.5 years, was let go, started consulting at +50% (but no benefits): $150k consulting.
- Consulted for 8 months, left as the project wrapped up for -40% (+10% over my previous full-time base): $110k base.
- Stayed there 11 years (the company was acquired after 3 years), increased pay by ~20% in 11 years: $133k base.
- Left for a 5% increase: $141k base.
- Worked there 6 months (bad fit), left for a 5% increase: $150k base. Here now.
Making about 4x after 20 years. This does not include any bonus (or not), benefits (health/retirement/etc.), stock options, quality of life, etc.

Current investments
Canadian ETF / funds I invest in in decreasing amounts.
XGRO VCN VXC VAB VT TDB900 TDB902 TDB909 TDB911 Tangerine Balanced (part of emergency fund) TDB661 CDZ

submitted by throwaway_canada1 to financialindependence [link] [comments]

Zhuoer Jiang: Talk about the difference between BTC, BCH and BSV


https://preview.redd.it/kcdq7qrjnyd51.jpg?width=570&format=pjpg&auto=webp&s=af67bd46683fbe3ffa6c081d490d69598dd83bbb
1. When did you first come into contact with Bitcoin? What do you think of the blockchain industry?
I came into contact with Bitcoin in October 2013. At that time, I was making game aids, which involved collecting payments from Taiwan and Southeast Asia.
The reason I committed to the blockchain industry is that blockchain is the only industry that can provide economic freedom. The blockchain is decentralized and has no control center, so no one can eliminate it, and so it provides economic freedom. The counter-example is QvodPlayer: although QvodPlayer also had tens of thousands of nodes, it was centralized, and once the control center was killed, the QvodPlayer network died. (QvodPlayer was a Chinese video-on-demand playback software using P2P technology; users could watch online film and television programs through buffering. In mainland China, QvodPlayer had a huge number of users. Due to its use of a dedicated transmission protocol, QvodPlayer was used by some users to download banned videos, such as violent or pornographic videos and politically sensitive videos. In addition, pirated movies were rampant on QvodPlayer.)
I am not a Bitcoinist. As long as other tokens provide economic freedom, I will buy them with real money. My position portfolio is BCH 40% + BTC 30% + ETH 20% + economically free innovative currency 10%, and I think that ETH is likely to exceed BTC in total market value in this bull market.
2. What is the difference between BTC, BCH and BSV?
The easiest thing to see is the difference in block size. BTC blocks have been locked at about 1MB, while BSV advocates infinite blocks. BCH advocates a moderate block size, which cannot exceed the carrying capacity of an ordinary computer. The current value is about 32MB.
Both BTC and BSV have gone to extremes. The BTC development team, Core, pursues extreme decentralization, resulting in too small blocks and high transaction fees. In the last bull market, a transaction fee was as high as hundreds of thousands of yuan, which caused a large number of BTC users to flow out to BCH, ETH and other tokens.
Some people think that BTC can rely entirely on store-of-value users rather than transacting users to survive. This is impossible: if there are no users, there are no store-of-value users. For example, gold is obviously more suitable for storing value, but almost everyone has bank deposits; except for the elderly, almost no one uses gold to store value.
People who transact in paper money naturally also store value in paper money. It is impossible to transact in paper money but store value in gold, or to use paper money for small transactions and gold for large ones. Currency has a scale effect, and it must be winner-takes-all.
BSV has gone to the other extreme. The blockchain is enough to store transaction data, but if the blockchain is used as cloud storage, no amount of space is enough. Think about how many resources the world has to store. The result is that the performance requirements are too high, the number of nodes drops drastically, and the foundation of the blockchain, which is decentralization, is lost. In the end, it falls into the same fate as QvodPlayer.
Behind the different block sizes are the differences in the spirit of the three. Just like during the Opium War, the difference between Britain and China's Qing Dynasty was not a superficial weapon, but a complete political, economic, and technological gap behind it.
Both BTC and BSV are irrational and religious to a certain extent. BTC advocates a deadlock block size, and BSV advocates a deadlock protocol. The two are very similar.
In terms of rational development and serving users, BCH has won. For example, the issuance of tokens is an important function and rigid demand of the blockchain. Tokens can already be issued on BCH through several protocols such as Wormhole and SLP, while BTC and BSV cannot yet. This is a huge difference in development.
3. Under what circumstances can BCH exceed BTC?
BCH has to wait for users to slowly develop until its number of users and transactions exceed BTC's. Normally, since currency has a scale effect, this would be unlikely to happen, but BTC made a fatal mistake: it locked the block size and locked out users.
What if BTC expands like BCH?
First of all, BTC cannot expand, because expansion requires a hard fork, and both the community and Core insist on 1MB and on extreme decentralization: BTC must be able to run on a Raspberry Pi. The result is that the expansion advocates within BTC and Core would have to hard fork again.
Isn't this the plot of the hard fork of BCH from BTC in 2017? So what are these "advocates" doing hard forking again? Just go straight to BCH.
Therefore, BTC must undergo a hard fork to expand, so it cannot be expanded.
So BCH only needs to catch up to a fixed goal. I estimate that in this bull market, BCH can exceed BTC's number of users. At that point, BCH will have a solid foundation of users and communities. A price increase only raises the price of BCH; the value of BCH is determined by the number of users, and the price fluctuates around the value.
4. Will BCH hard fork happen? What impact will it have on us later?
The BCH community has recently had a lot of discussions on the issue of miner donations, which reflects the decentralization of BCH.
If BCH were controlled by Bitmain, why did it take so long for Bitmain to push this issue through? Conversely, if CSW wants to modify something on BSV, it is passed immediately.
5. Do you think BCH is worth long-term ownership?
I often say: "Ask God in the short term, and the number of users in the long term."
The longer the time, the more it is worth holding BCH. BCH is developing rapidly due to its correct route. As I just noted, there are already several schemes for issuing tokens on BCH, but neither BTC nor BSV has one. Part of this is because BSV locks the protocol, which is not convenient for development; the other part is because the BSV community has inherited the characteristics of CSW and only talks big without doing practical things.
Therefore, it is definitely worth holding for 1 to 2 years, and the rate of increase is likely to be higher than that of BTC. I predict that the highest point of this round of bull market for BCH will rise from about 3.6% of BTC to 10% to 20% of BTC.
8. Free Q&A
"Will Bitcoin die due to quantum computers or other reasons?"
Certainly not; at worst it would switch to a quantum-resistant algorithm. As things stand, quantum computers will not be practical for a long time. And I think quantum computers may not be able to solve NP problems, i.e. break current asymmetric encryption; that may not even be mathematically possible.
"The impact of the proliferation of contract transactions on currency prices?"
The currency price is ultimately determined by the number of users, not by speculative users. The proliferation of futures trading has happened long ago. From 2016 to 2017, in the presence of a large number of futures trading, BTC rose 100 times.
"Will you notify us when you exit the market?"
I will definitely not inform. I have already made predictions. I think the bull market may end in the second half of 2021. Or conversely, this bull market may last for two to three years, and two years are more likely.
Why not notify? Most of my clients are miners, and the currency price directly affects the income of the miners. If the currency price drops due to my notification, the interests of my clients will be damaged.
"Recommended regular investment in 2019, what strategy is recommended in 2020?"
This year's bull market has begun, so it calls for a full-position investment. The cost of buying coins later via regular (dollar-cost-averaged) investment is very high.
"Is it better to speculate or to mine now?"
Most people can't manage to hold a token from start to finish. Most sell in the middle of the bull market, or even at the beginning, and then miss the entire bull market.
Only miners, at whatever level, hold the token from beginning to end. During the entire bull market, miners are very profitable. Miners will certainly not sell the goose that lays golden eggs (their mining machines) in the bull market, so miners tend to make more. The earliest miners are basically still active in the market, and their wealth is secure, while the earliest holders of coins are almost all gone.
submitted by paulcheung1990 to Bitcoincash [link] [comments]

Filecoin | Development Status and Mining Progress

Author: Gamals Ahmed, CoinEx Business Ambassador
https://preview.redd.it/5bqakdqgl3g51.jpg?width=865&format=pjpg&auto=webp&s=b709794863977eb6554e3919b9e00ca750e3e704
A decentralized storage network that transforms cloud storage into an algorithmic market. Miners earn the protocol's native token by providing data storage and / or retrieval; conversely, customers pay miners to store or distribute data and retrieve it.
Filecoin announced that there will be more delays before its main network is officially launched.
Filecoin developers postponed the release date of their main network to between late July and late August 2020.
As mentioned in a recent announcement, the Filecoin team said it has completed the first round of the internal protocol security audit. Platform developers say the review results showed that they need to make several changes to the protocol's code base before performing the second stage of the software testing process.
Created by Protocol Labs, Filecoin was developed on top of the InterPlanetary File System (IPFS), a peer-to-peer data storage network. Filecoin will allow users to trade storage space in an open and decentralized market.
Filecoin's developers ran one of the largest cryptocurrency sales of 2017, privately raising over $200 million from professional or accredited investors, including many institutional investors.
The main network was slated to launch last month, but in February 2020 the Filecoin development team delayed the release of the main network to between July 15 and July 17, 2020.
They claimed that the outbreak of the Coronavirus (COVID-19) in China was the main cause of the delay. The developers now say that they need more time to solve the problems found during a recent codebase audit.
The Filecoin team noted the following:
“We have drafted a number of protocol changes to ensure that our main network launch is safe and economically sound.” The project developers will add them to the two different implementations of Filecoin (Lotus and go-filecoin) in the coming weeks.
Filecoin developers conducted a survey to allow platform community members to cast their votes on three different launch dates for Testnet Phase 2 and mainnet.
The team reported that the community gave their votes. Based on the vote results, the Filecoin team announced a “conservative” estimate that the second phase of the network test should begin by May 11, 2020. The main Filecoin network may be launched sometime between July 20 and August 21, 2020.
The updates to the project can be found on the Filecoin Road Map.
Filecoin developers stated:
“This option will let us get the most important protocol changes in first, and then implement the rest as protocol updates during testnet.”
Filecoin has also pushed back its final test stage. The decentralized storage network launched its calibration test network, the final stage of testing for the blockchain-backed storage network.
In a blog post on its website, Filecoin said it will postpone the last test round until August. The company also announced a calibration period from July 20 to August 3 to allow miners to test their mining setups and get an idea of how competition conditions affect their rewards.
Filecoin had announced earlier last month that the calibration testnet would precede its flagship launch. The delay in the final test also means that the company has moved the mainnet launch window to between August 31 and September 21.
Despite the lack of clear incentives for miners and multiple delays, Filecoin has succeeded in attracting huge interest, especially in China. Investors remained highly speculating on the network’s mining hardware and its premium price.
Mining in Filecoin
In most blockchain protocols, “miners” are network participants who do the work necessary to promote and maintain the blockchain. To provide these services, miners are compensated in the original cryptocurrency.
Mining in Filecoin works completely differently: instead of contributing computational power, miners contribute storage capacity for deals with customers looking to store data.
Filecoin will contain several types of miners:
Storage miners, responsible for storing files and data on the network; retrieval miners, responsible for providing quick pipes for file retrieval; and repair miners, yet to be implemented.
Storage miners are the heart of the network. They earn Filecoin by storing data for clients and computing cryptographic proofs to verify storage over time. The probability of earning the block reward and transaction fees is proportional to the amount of storage the miner contributes to the Filecoin network, not to hash power.
Retrieval miners are the veins of the network. They earn Filecoin by winning bids and mining fees for a specific file, determined by the market value of that file's size. A miner's bandwidth and initial response time for retrieval deals will determine its ability to close retrieval deals on the network.
The maximum bandwidth of a retrieval miner determines the total number of deals it can enter into.
In the current implementation, the focus is mostly on storage miners, who sell storage capacity for FIL.

Hardware recommendations

The current system specifications recommended for running the miner are:
Compared to the hardware requirements for running a validator, these standards are much higher, although they definitely earn their keep. Since they will not increase in the foreseeable future, the money spent on Filecoin mining hardware should provide users with many years of reliable service and pay for itself many times over. Think of the investment as a small cloud-storage business. To launch on the current data-hosting model, it would cost millions of dollars in infrastructure and logistics to get started; with Filecoin, you can do the same for a few thousand dollars.
Proceed to mining
Deals are the primary function of the Filecoin network, and it represents an agreement between a client and miners for a “storage” contract.
Once the customer decides to use a miner for storage, based on the available capacity, duration and price required, they lock sufficient funds in an associated wallet to cover the total cost of the deal. The deal is then published once the miner accepts the storage agreement. By default, all Filecoin miners are set to automatically accept any deal that meets their criteria, although this can be disabled by miners who prefer to organize their deals manually.
After the deal is published, the customer prepares the data for storage and then transfers it to the miner. Upon receiving all the data, the miner fills the data into a sector, seals it, and begins submitting proofs to the chain. Once the first confirmation is obtained, the customer can be sure the data is stored correctly, and the deal has officially started.
Throughout the deal, the miner submits continuous proofs to the chain, and the client pays gradually with the funds they previously locked. If proofs are missing or late, the miner is penalized. More information about this can be found in the Uptime, Slashing and Penalties section of this page.
At Filecoin, miners earn two different types of rewards for their efforts: storage fees and block rewards.
Storage fees are the fees that customers pay regularly after reaching a deal, in exchange for storing data. This fee is automatically deposited into the withdrawal wallet associated with the miner as they continue to perform their duties over time, and is locked for a short period upon receipt.
Block rewards are larger sums granted to the miner who mines a new block. Unlike storage fees, these rewards do not come from an associated customer; instead, the network "prints" new FIL as an inflationary incentive for miners to advance the chain. All active miners on the network have a chance at a block reward, with odds directly proportional to the amount of storage space they currently contribute to the network.
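That storage-power-weighted lottery can be sketched as follows (purely illustrative; Filecoin's actual leader election uses verifiable on-chain randomness, not a loop like this):

```python
import random

def pick_block_winner(power_by_miner):
    # Each miner's chance is proportional to its share of committed storage.
    total = sum(power_by_miner.values())
    draw = random.uniform(0, total)
    for miner, power in power_by_miner.items():
        draw -= power
        if draw <= 0:
            return miner
    return miner  # guard against floating-point edge cases

pick_block_winner({"a": 8.0, "b": 2.0})  # "a" wins roughly 80% of the time
```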
Uptime, slashing and penalties
“Slashing” is a feature found in most blockchain protocols, used to punish miners who fail to provide reliable uptime or who act maliciously against the network.
In Filecoin, miners are subject to two different types of slashing: storage fault slashing and consensus fault slashing.
Storage fault slashing is a term covering a broader range of penalties, including fault fees, sector penalties, and termination fees. Miners must pay these penalties if they fail to keep their sectors reliable or decide to leave the network voluntarily.
A fault fee is a penalty a miner incurs for each day a sector is offline. A sector penalty is incurred for a faulted sector that was not reported before the WindowPoSt check.
The sector pays a fault fee on top of the sector penalty once the fault is detected.
A termination fee is a penalty a miner incurs when a sector is voluntarily or involuntarily terminated and removed from the network.
Consensus fault slashing is the penalty a miner incurs for committing consensus faults. This punishment applies to miners who have acted maliciously against the network's consensus function.
Filecoin miners
Eight of the top 10 Filecoin miners are Chinese investors or companies, according to the project's block explorer, while more companies are selling cloud mining contracts and hardware for the distributed file-sharing system. CoinDesk's Wolfie Zhao wrote: “China's craze for Filecoin may have been largely related to the long-standing popularity of crypto mining in the country overall, which is home to about 65% of the computing power on Bitcoin by some estimates.”
With Filecoin approaching its mainnet launch, after several delays since the $200 million raise in 2017, Chinese investors are once again speculating strongly on network mining devices and their premium prices.
Since Protocol Labs, the company behind Filecoin, announced its “Testnet Incentives” program on June 9, scheduled to start a week later, more than a dozen Chinese companies have started selling cloud mining contracts and hardware, even though important details such as the economics of mining incentives on the main network remain unsettled.
Sales volumes to date for each of these companies range from half a million to tens of millions of dollars, according to self-reported data on these platforms reviewed by CoinDesk and interviews with several mining hardware manufacturers.
Filecoin’s goal is to build a distributed storage network with token rewards to spur storage hosting as a way to drive wider adoption. Protocol Labs launched a test network in December 2019. But the tokens mined in the testing environment so far are not representative of the real filecoin that can be traded once mainnet goes live. Moreover, the mining incentive economics on testnet do not represent how block rewards will ultimately be distributed on mainnet.
However, data from Filecoin testnet block explorers shows that eight of the 10 miners with the most effective mining power on testnet are currently Chinese miners.
These eight miners have about 15 petabytes (PB) of effective storage mining power, accounting for more than 85% of the testnet total of 17.9 PB. For context, 1 petabyte of hard disk storage = 1,000 terabytes (TB) = 1 million gigabytes (GB).
The Filecoin craze in China may be closely related to the long-standing popularity of crypto mining in the country overall, which is home to an estimated 65% of Bitcoin's computing power. In addition, there has been plenty of hype in China around Filecoin mining since 2018, with companies promoting all kinds of devices while the network was still in development.
“Crypto mining has always been popular in China,” said Andy Tien, co-founder of 1475, one of several Filecoin mining hardware manufacturers backed by prominent Chinese venture capital firms such as Fenbushi and HashKey Capital.
“Even though the Filecoin mining process is more technologically sophisticated, the idea of mining with hard drives instead of specialized machines like Bitcoin ASICs may be a lot easier for retail investors to understand,” he said.
Meanwhile, according to Feixiaohao, a Chinese service comparable to CoinMarketCap, nearly 50 Chinese crypto exchanges — mostly little-known, though some better-known exchanges such as Gate.io and Biki are among them — have listed Filecoin futures trading pairs against USDT.
In bitcoin mining, at the current difficulty level, one terahash per second (TH/s) of hash rate is expected to generate around 0.000008 BTC within 24 hours. The higher the TH/s figure, the more bitcoin it should be able to produce, proportionately. But in Filecoin, a miner’s effective mining power depends on the amount of data sealed onto its hard drives, not the total size of those drives.
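The proportionality described above can be sketched in a few lines. The 0.000008 BTC/day per TH/s figure comes from the text and changes with network difficulty; the helper function is illustrative only.

```python
# Rough sketch: expected daily BTC is proportional to hash rate.
# The per-TH/s figure is taken from the text above and varies with difficulty.
BTC_PER_THS_PER_DAY = 0.000008

def expected_btc_per_day(hashrate_ths: float) -> float:
    """Expected daily BTC for a given hash rate at the stated difficulty."""
    return hashrate_ths * BTC_PER_THS_PER_DAY

print(expected_btc_per_day(1))        # a single TH/s
print(expected_btc_per_day(100_000))  # 100,000 TH/s = 100 PH/s
```

Filecoin has no equivalent formula at the hardware level: the analogous input would be sealed data, not raw drive capacity.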
To seal data onto a hard drive, a Filecoin miner still needs processing power — a CPU or GPU, as well as RAM. More powerful processors with better-optimized software can seal data more quickly, so those miners accumulate effective mining power faster on any given day.
At this stage, there appears to be no transparent way at the network level for retail investors to see how much of the hard drive capacity they purchased actually translates into effective mining power.
U.S.-based Protocol Labs was behind Filecoin’s 2017 initial coin offering, which raised an astonishing $200 million.
This was on top of a $50 million private raise backed by notable venture capital firms including Sequoia, Andreessen Horowitz and Union Square Ventures. CoinDesk’s parent company, Digital Currency Group, has also invested in Protocol Labs.
After rounds of delays, Protocol Labs said in September 2019 that a testnet would launch around December 2019 and that mainnet would roll out in the first quarter of 2020.
The testnet launched as promised, but mainnet has been delayed again and is now expected to launch in August 2020.
What is the Filecoin mining process?
Filecoin mainly consists of three parts: the storage market (on-chain), the Filecoin blockchain, and the retrieval market (off-chain). The storage and retrieval markets are placed on-chain and off-chain respectively, for security and efficiency. For users, storage is relatively infrequent and its security requirements are relatively high, so the storage process happens on-chain. Retrieval happens far more often than storage once there is a certain amount of data, and given the performance cost of processing data on-chain, retrieval is performed off-chain. To solve the security issue of payment during retrieval, Filecoin adopts a micropayment strategy. In simple terms, a document is split into several pieces, and each time the user receives a portion of the data, the corresponding fee is paid. The miner types corresponding to Filecoin’s two major markets are storage miners and retrieval miners: storage miners are primarily responsible for storing data and packing blocks, while retrieval miners are primarily responsible for serving data queries. After the Filecoin mainnet is running stably, repair miners will be introduced, mainly responsible for data maintenance.
In the initial release of Filecoin, the storage and retrieval markets use an order-taking mechanism rather than an order-matching mechanism. The three main parts of Filecoin correspond to three processes: the storage process, the retrieval process, and the packing-and-reward process. The following figure shows the simplified process and the miners’ income:
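The pay-per-piece retrieval idea above can be sketched as follows. The chunk size and the payment callback are illustrative assumptions, not Filecoin's actual wire protocol; the point is that the client pays a small fee as each piece arrives, which limits counterparty risk on both sides.

```python
# Sketch of micropayment retrieval: the file is split into pieces and the
# client pays a small fee for each piece as it is delivered.
def split_into_chunks(data: bytes, chunk_size: int):
    return [data[i:i + chunk_size] for i in range(0, len(data), chunk_size)]

def retrieve_with_micropayments(chunks, price_per_chunk, pay):
    received = b""
    for chunk in chunks:
        pay(price_per_chunk)   # pay as each piece is released
        received += chunk
    return received

payments = []
data = b"hello filecoin retrieval market"   # 31 bytes
chunks = split_into_chunks(data, 8)          # 4 chunks of up to 8 bytes
out = retrieve_with_micropayments(chunks, price_per_chunk=1, pay=payments.append)
assert out == data
print(len(chunks), sum(payments))  # 4 4
```

If the miner withholds a piece, the client has only paid for what it actually received so far.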
The Filecoin mining process is considerably more complicated than this, and the most important factor determining mining profit is effective storage. Effective storage is a key feature that distinguishes Filecoin from other decentralized storage projects. In Filecoin’s Expected Consensus (EC), effective storage plays a role similar to stake in PoS: it determines the likelihood that a miner wins the right to pack a block, i.e. a miner’s share of the network’s total effective storage is proportional to its final mining revenue.
It is also possible to achieve higher effective storage under the same hardware conditions by improving the mining algorithm, although how much benefit such algorithmic improvements can deliver is still unknown.
Filecoin seeks to boost mining with Filecoin Discover
Filecoin announced Filecoin Discover — a program to encourage miners to join the Filecoin network. According to the company, Filecoin Discover is “an ever-growing catalog of numerous petabytes of public data covering literature, science, art, and history.” Interested miners can choose which data sets they want to store and receive that data on a drive, at cost. In exchange for storing this verified data, miners will earn additional filecoin on top of the regular block rewards. The current catalog includes open-source data sets such as ENCODE, 1000 Genomes, Project Gutenberg and Berkeley self-driving data, with more projects and data sets added every day.
Ian Darrow, Head of Operations at Filecoin, commented on the announcement:
“Over 2.5 quintillion bytes of data are created every day. This data includes 294 billion emails, 500 million tweets and 64 billion messages on social media. But it is also climatology reports, disease tracking maps, connected vehicle coordinates and much more. It is extremely important that we maintain data that will serve as the backbone for future research and discovery”.
Miners who choose to participate in Filecoin Discover may receive hard drives pre-loaded with verified data, as well as setup and maintenance instructions, according to the company. The Filecoin team will also host a Slack channel (fil-discover-support) where miners can learn more.
Filecoin has had its fair share of obstacles along the way. Last month, Filecoin announced a further delay before its mainnet officially launches — after years of fundraising.
In late July, QEBR (OTC: QEBR) announced that it had divested two subsidiaries in order to focus all of the company’s resources on building blockchain-based mining operations.
The QEBR technology team previously announced that it had validated its system as a Filecoin node, with CPU, GPU, bandwidth and storage configurations meeting all IPFS guidelines. The QEBR test system is connected to the Filecoin test blockchain and has already mined test filecoin.
“The divestiture of Sheen Boom and Jihye will allow our team to focus solely on the upcoming global launch of Filecoin. QEBR’s subsidiary, Shenzhen DZD Digital Technology Ltd. (“DZD”), has a strong background in blockchain development, data mining, data acquisition, data processing and data technology research. We strongly believe Filecoin has the potential to be a leading blockchain-based cryptocurrency, and we will make every effort to make QEBR an important player when the Filecoin mainnet launches soon.”
IPFS and Filecoin
Filecoin and IPFS are complementary protocols for storing and sharing data in a decentralized network. While users are not required to use Filecoin and IPFS together, the two combined address major shortcomings in the current web infrastructure.
IPFS
IPFS is an open-source protocol that allows users to store and transfer verifiable data peer-to-peer. IPFS users persist data on the network by pinning it on their own device, via a third-party cloud service (known as a pinning service), or through community-oriented systems where a group of individual IPFS users share resources to keep the content alive.
The lack of a built-in incentive mechanism is the challenge Filecoin hopes to solve, by allowing users to incentivize long-term distributed storage at competitive prices through a storage contract market, while maintaining the efficiency and flexibility the IPFS network provides.
Using IPFS
In IPFS, data is hosted by the nodes that pin it. For data to persist while a user’s node is offline, the user must either rely on other peers to voluntarily pin the data or use a centralized pinning service to store it.
Relying on peers to cache data can work well when one or more organizations share common files on an internal network, or where strong social contracts can ensure continued hosting and preservation of content in the long run. In practice, though, most users on an IPFS network rely on a pinning service.
Using Filecoin
Another option is to persist your data in a decentralized storage market, such as Filecoin. In Filecoin’s structure, customers make a series of small payments to store data with a given level of availability, while miners earn those payments by storing the data, constantly proving its integrity, and ensuring its quick retrieval. This allows users to pay Filecoin miners to ensure that their content stays live when it is needed, a distinct advantage over relying solely on other network users, as using IPFS alone requires.
Filecoin, powered by IPFS
It is important to understand the relationship: Filecoin is built on top of IPFS and aims to be a highly integrated and seamless storage market that takes advantage of the basic functions IPFS provides. The two are connected to each other, but each can be used completely independently of the other; users do not need to interact with Filecoin in order to use IPFS.
Some advantages of combining Filecoin with IPFS:
Of all the decentralized storage projects, Filecoin is undoubtedly the most anticipated, and IPFS has been running stably for two years, fully demonstrating the strength of its core protocol.
Filecoin’s ability to win market share from traditional centralized storage depends on end-user experience and storage price. Currently, most Filecoin nodes are deployed in IDC machine rooms; actual deployment and operating costs are no lower than traditional centralized cloud storage, and the storage process is more complicated.
PoRep and PoSt involve a large amount of zero-knowledge proof computation, which pushes up the real cost of storage. In the early days after Filecoin’s release, the actual cost of storing data may be higher than centralized cloud storage, but early storage nodes may cut their storage prices in order to win block rewards, which could leave the effective storage price below that of traditional centralized cloud storage.
In the long term, Filecoin still needs to take full advantage of its P2P storage, move storage hardware from specialized to commodity devices, and improve its algorithms to reduce storage costs without hurting user experience. Storage is an important problem for the blockchain field to solve, which is why a large number of storage projects presented at the 2019 Web3 Summit. IPFS is an important part of the Web3 vision, and its development will shape Web3 to some extent, just as Web3’s development will partly determine the future of IPFS. Filecoin is the IPFS-based storage project initiated by the IPFS team itself, and there is no doubt that expectations for it are high.
Resources :
  1. https://www.coindesk.com/filecoin-pushes-back-final-testing-phase-announces-calibration-period-for-miners
  2. https://docs.filecoin.io/mine/#types-of-miners
  3. https://www.nasdaq.com/articles/inside-the-craze-for-filecoin-crypto-mining-in-china-2020-07-12?amp
  4. https://www.prnewswire.com/news-releases/qebr-streamlines-holdings-to-concentrate-on-filecoin-development-and-mining-301098731.html
  5. https://www.crowdfundinsider.com/2020/05/161200-filecoin-seeks-to-boost-mining-with-filecoin-discove
  6. https://zephyrnet.com/filecoin-seeks-to-boost-mining-with-filecoin-discove
  7. https://docs.filecoin.io/introduction/ipfs-and-filecoin/#filecoin-powered-by-ipfs
submitted by CoinEx_Institution to filecoin [link] [comments]

Murmurs of the Sea | Monthly Portfolio Update - March 2020

Only the sea, murmurous behind the dingy checkerboard of houses, told of the unrest, the precariousness, of all things in this world.
-Albert Camus, The Plague
This is my fortieth portfolio update. I complete this update monthly to check my progress against my goal.
Portfolio goal
My objective is to reach a portfolio of $2 180 000 by 1 July 2021. This would produce a real annual income of about $87 000 (in 2020 dollars).
This portfolio objective is based on an expected average real return of 3.99 per cent, or a nominal return of 6.49 per cent.
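The income figure follows directly from the target balance and the assumed real return. A back-of-the-envelope check, assuming the income is simply the real return drawn on the target balance:

```python
# Sanity check: $2,180,000 target at a 3.99% expected real return.
target = 2_180_000
real_return = 0.0399

annual_income = target * real_return
print(round(annual_income))  # 86982 — about $87,000 pa in 2020 dollars
```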
Portfolio summary
Vanguard Lifestrategy High Growth Fund – $662 776
Vanguard Lifestrategy Growth Fund – $39 044
Vanguard Lifestrategy Balanced Fund – $74 099
Vanguard Diversified Bonds Fund – $109 500
Vanguard Australian Shares ETF (VAS) – $150 095
Vanguard International Shares ETF (VGS) – $29 852
Betashares Australia 200 ETF (A200) – $197 149
Telstra shares (TLS) – $1 630
Insurance Australia Group shares (IAG) – $7 855
NIB Holdings shares (NHF) – $6 156
Gold ETF (GOLD.ASX) – $119 254
Secured physical gold – $19 211
Ratesetter (P2P lending) – $13 106
Bitcoin – $115 330
Raiz* app (Aggressive portfolio) – $15 094
Spaceship Voyager* app (Index portfolio) – $2 303
BrickX (P2P rental real estate) – $4 492
Total portfolio value: $1 566 946 (-$236 479 or -13.1%)
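As a sanity check on the summary above, the listed holdings do sum exactly to the stated total (values in dollars, as given; the dictionary keys are abbreviated labels, not official fund names):

```python
# Holdings from the portfolio summary above.
holdings = {
    "VLS High Growth": 662_776, "VLS Growth": 39_044, "VLS Balanced": 74_099,
    "Diversified Bonds": 109_500, "VAS": 150_095, "VGS": 29_852,
    "A200": 197_149, "TLS": 1_630, "IAG": 7_855, "NHF": 6_156,
    "GOLD.ASX": 119_254, "Physical gold": 19_211, "Ratesetter": 13_106,
    "Bitcoin": 115_330, "Raiz": 15_094, "Spaceship": 2_303, "BrickX": 4_492,
}

total = sum(holdings.values())
print(total)  # 1566946 — matches the stated portfolio value
```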
Asset allocation
Australian shares – 40.6% (4.4% under)
Global shares – 22.3%
Emerging markets shares – 2.3%
International small companies – 3.0%
Total international shares – 27.6% (2.4% under)
Total shares – 68.3% (6.7% under)
Total property securities – 0.2% (0.2% over)
Australian bonds – 4.8%
International bonds – 10.4%
Total bonds – 15.2% (0.2% over)
Gold – 8.8%
Bitcoin – 7.4%
Gold and alternatives – 16.2% (6.2% over)
Presented visually, below is a high-level view of the current asset allocation of the portfolio.
Comments
This month saw an extremely rapid collapse in market prices for a broad range of assets across the world, driven by the acceleration of the Coronavirus pandemic.
Broad and simultaneous market falls have resulted in the single largest monthly fall in portfolio value to date of around $236 000.
This represents a fall of 13 per cent across the month, and an overall reduction of more than 16 per cent since the portfolio peak in January.
[Chart]
The monthly fall is over three times more severe than any other fall experienced to date on the journey. Sharpest losses have occurred in Australian equities, however, international shares and bonds have also fallen.
A substantial fall in the Australian dollar has provided some buffer to international equity losses - limiting these to around 8 per cent. Bitcoin has also fallen by 23 per cent. In short, in the period of acute market adjustment - as often occurs - the benefits of diversification have been temporarily muted.
[Chart]
The last monthly update reported results of some initial simplified modelling on the impact of a hypothetical large fall in equity markets on the portfolio.
Currently, the actual asset price falls look to register in between the normal 'bear market', and the more extreme 'Global Financial Crisis Mark II' scenarios modelled. Absent, at least for the immediate phase, is a significant diversification offset - outside of a small (4 per cent) increase in the value of gold.
The continued sharp equity market losses have left the portfolio below its target Australian equity weighting, so contributions this month have been made to Vanguard's Australian shares ETF (VAS). This coming month will see quarterly distributions paid for the A200, VGS and VAS exchange traded funds - totalling around $2700 - meaning a further small opportunity to reinvest following sizeable market falls.
Reviewing the evidence on the history of stock market falls
Vladimir Lenin once remarked that there are decades where nothing happens, and then there are weeks in which decades happen. This month has been four such weeks in a row, from initial market responses to the coronavirus pandemic, to unprecedented fiscal and monetary policy responses aimed at lessening the impact.
Given this, it would be foolish to rule out the potential for other extreme steps that governments have undertaken on multiple occasions before. These could include underwriting of banks and other debt liabilities, effective nationalisation or rescues of critical industries or providers, or even temporary closure of some financial or equity markets.
There is a strong appeal for comforting narratives in this highly fluid investment environment, including concepts such as buying while distress selling appears to be occurring, or delaying investing until issues become 'more clear'.
Nobody can guarantee that investments made now will not be made into cruel short-lived bear market rallies, and no formulas exist that will safely and certainly minimise either further losses, or opportunities forgone. Much financial independence focused advice in the early stages of recent market falls focused on investment commonplaces, with a strong flavour of enthusiasm at the potential for 'buying the dip'.
Yet such commonly repeated truths turn out to be imperfect and conditional in practice. One of the most influential studies of a large sample of historical market falls turns out to provide mixed evidence that buying following a fall reliably pays off. This study (pdf) examines 101 stock market declines across four centuries of data, and finds that:
Even these findings should be viewed as simply indicative. Each crisis and economic phase has its unique character, usually only discernible in retrospect. History, in these cases, should inform around the potential outlines of events that can be considered possible. As the saying goes, risk is what remains after you believe you have thought of everything.
Position fixing - alternative perspectives of progress
In challenging times it can help to keep a steady view of progress from a range of perspectives. Extreme market volatility and large falls can be disquieting for both recent investors and those closer to the end of the journey.
One perspective on what has occurred is that the portfolio has effectively been pushed backwards in time. That is, the portfolio now sits at levels it last occupied in April 2019. Even this perspective has some benefit, highlighting that by this metric all that has been lost is the strong forward progress made in a relatively short time.
Yet each perspective can hide and distort key underlying truths.
As an example, while the overall portfolio is currently valued at around the same dollar value as a year ago, it is not the same portfolio. Through new purchases and reinvestments in this period, many more actual securities (mostly units in ETFs) have been purchased.
The chart below sets out the growth in total units held from January 2019 to this month, across the three major exchange trade funds holdings in the portfolio.
[Chart]
From this it can be seen that the number of securities held - effectively, individual claims on the future earnings of the firms in these indexes - has more than doubled over the past fifteen months. Through this perspective, the accumulation of valuable assets shows a far more constant path.
Though this can help illuminate progress, as a measure it also has limitations. The realities of falls in market values cannot be elided by such devices, and some proportion of those market falls represent initial reassessments of the likely course of future earnings, and therefore the fundamental value of each of those ETF units.
With significant uncertainty over the course of global lock-downs, trade and growth, the basis of these reassessments may prove accurate, or not. For anyone to discount all of these reassessments as wholly the temporary result of irrational panic is to show a remarkable confidence in one's own analytical capacities.
Similarly, it would be equally wrong to extrapolate from market falls to a permanent constraining of the impulse of humanity to innovate, adjust to changed conditions, seek out opportunities and serve others for profit.
Lines of position - Trends in expenditure
A further longer-term perspective regularly reviewed is monthly expenses compared to average distributions.
Monthly expenditure continues to be below average, and is likely to fall further next month as a natural result of a virus-induced reduction of shopping trips, events and outings.
[Chart]
As occurred last month, with some previous high distributions gradually falling outside the data 'window' for the rolling three-year comparison of distributions and expenditure, the downward slope in distributions continues.
Progress
Progress against the objective, and the additional measures I have reached is set out below.
Measure                                            Portfolio   All Assets
Portfolio objective – $2 180 000 (or $87 000 pa)   71.9%       97.7%
Credit card purchases – $71 000 pa                 87.7%       119.2%
Total expenses – $89 000 pa                        70.2%       95.5%
Summary
This month has been one of the most surprising and volatile of the entire journey, with significant daily movements in portfolio value and historic market developments. There has been more to watch and observe than at any time in living memory.
The dominant sensation has been that of travelling backwards through time, and revisiting a stage of the journey already passed. The progress of the last few months was so rapid that this backwards travel has felt less like a setback and more like a temporary revisitation of days past.
It is unclear how long a revisitation current conditions will enforce, or exactly how this will affect the rest of the journey. In early January I estimated that if equity markets fell by 33 per cent through early 2020 with no offsetting gains in other portfolio elements, this could push out the achievement of the target to January 2023.
Even so, experiencing these markets and with more volatility likely, I don't feel there is much value in seeking to rapidly recalculate the path from here, or immediately alter the targeted timeframe. Moving past the portfolio target from here in around a year looks almost impossibly challenging, but time exists to allow this fact to settle. Too many other, more important, human and historical events are still playing out.
In such times, taking diverse perspectives on the same facts is important. This Next Life recently produced this interesting meditation on the future of FIRE during this phase of economic hardship. In addition, the Animal Spirits podcast also provided a thoughtful perspective on current market falls compared to 2008, as does this article by Early Retirement Now. Such analysis, and each passing day, highlights that the murmurs of the sea are louder than ever before, reminding us of the precariousness of all things.
The post, links and full charts can be seen here.
submitted by thefiexpl to fiaustralia [link] [comments]

We Just Hit $500k NW! 25 & 26 Years Old. $156k DINK Income. Here's Our Story.

Hi guys. I've been a subscriber and lurker here for over 5 years, and one of my favorite things to read is the "how you got to where you are" stories. I made this post once I hit $300k, but my new wife and I just hit the $500k milestone together and we thought it would be worthwhile to share our story. We hate lack of transparency and ambiguity, so we'll be as open and honest as possible. We tried to include everything that we would want to know as if we were reading someone else's post. Feel free to ask us any questions.
There is a TLDR at the bottom of this post. You probably want to read that first to see if you're interested before investing your time in conquering this wall of text.
Also, you can skip the wall of text below about our childhood/college/relationship stuff if you're just interested in the "numbers". I just wanted to include this background to provide context.
TLDR: Two married DINKs hit $500k net worth at ages 25 and 26. We both had frugal upbringings. We both got scholarships/grants to a top in-state university. We both worked 2-3 jobs every week while attending college to pay for living expenses and saved the rest. I got a degree in business. She got a degree in industrial engineering. We graduated with no debt and $80k saved up. I made $55k/year at my first full-time job and she made $53k/year at hers. 4.5 years after graduating I currently make $86k, and 2.5 years after graduating she makes $70k. Our monthly budget is $1,822/month. Our savings rate is 81%. We have been dating for 9 years, living together for 4 years, and we got married in May.
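The stated budget and savings rate imply roughly the following take-home pay. This is a hypothetical back-calculation: the post does not state after-tax income, so the assumption that the 81% rate applies to take-home pay is mine.

```python
# Back-of-the-envelope: what take-home pay is implied by an $1,822/month
# budget and an 81% savings rate (assumed to apply to after-tax income)?
monthly_budget = 1_822
savings_rate = 0.81

annual_spend = monthly_budget * 12            # 21,864 per year
implied_take_home = annual_spend / (1 - savings_rate)
print(annual_spend, round(implied_take_home))  # 21864 115074
```

That implied ~$115k after tax is plausible for the stated $156k gross DINK income, so the numbers hang together.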
submitted by aFinancialWreck to financialindependence [link] [comments]

How and Why to Balance A.I. Spawn Locations, Density, Behaviors, and Loot Tables

A.I. stats seemingly have only a subtle effect on gameplay, but in reality these easily adjustable “ninja stats” contribute significantly to the overall gameplay experience in Escape From Tarkov. A.I. stat tables texture the in-raid experience by playing a key role in the economy and by shaping immersion and incentive factors, i.e. resource texturing. Having purchased my first account on December 28, 2016, I have had plenty of time to note how these factors have affected players through every phase of development since nearly the beginning of the public launch, and I have concluded that the subject has gone largely overlooked in discussions of the overall gameplay experience.
Scavs texture each map by determining player pathing through incentive placement. Players should see scav spawns as potential loot and exp resources that accentuate different areas of the map, but with the current deemphasized/randomized loot tables, players rarely seek out these locations. This means that scavs become a mere annoyance for many players or, in some cases, rarely sought-out locations for completing some entry-level quests. Once players have passed these quests they no longer seek out scav spawns and go straight for high-tier loot spawns, as these tend to attract more players. Players rarely wander the entire map searching for scavs that may or may not be there. New and casual players rarely understand the locations, behaviors, or loot tables of scavs, and are therefore surprised by them while undergeared; if they do succeed, they find that scavs rarely provide seriously useful or fun loot.
One example of this resource texturing working correctly right now is the Factory bathroom. Players are grouping up in the EFT official Discord and grinding out scavs in the Factory bathroom, and they are having a lot of fun. New players are finding consistent exp and loot, and they are able to work with and learn from more experienced players. It works on Factory right now, but it is not effective on all maps, and the loot tables should be tweaked significantly to make it more effective.
In terms of map resource texturing, I propose that scavs should spawn in an orderly way, in larger groups, at specific locations, with dynamic sequential spawning. Dynamic sequential spawning means that instead of 2-3 scavs spawning at 3-4 different locations on a given map, they should spawn in groups of 5-6 at one initial location; after the first patrol is wiped, a second patrol spawns at a known secondary location; once this patrol is wiped, a third patrol of 5-6 scavs spawns at a known third location, and so on. There could easily be 5-10 sequential, ordered scav spawn locations on any given map. For example, on Woods, Wood Camp could be the first spawn location, with the patrol moving around the perimeter of the camp in a fairly tight group (say a 10-15 meter spread); second could be RAUF patrolling from east to west along the road; third West Border; fourth South-V; fifth East Gate; sixth Scav House; seventh Marked Circle; eighth Wood Camp again; ninth Mountain Stash; tenth Gate to Factory. By doing this you would have a total potential of 50-60 scavs spawning throughout a raid if players systematically clear patrols, and yet there would only be 5-6 scavs on the map at any given time, which reduces lag and performance issues. This would also concentrate PVP around these patrols.
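The sequential spawning proposal above can be sketched in pseudocode form. The location names follow the Woods example in the text; the class and method names are my own invention, not anything from the game's actual codebase.

```python
# Sketch of dynamic sequential spawning: one patrol alive at a time; when a
# patrol is wiped, the next one spawns at the next known location in order.
import random

WOODS_SPAWN_ORDER = [
    "Wood Camp", "RAUF Road", "West Border", "South-V", "East Gate",
    "Scav House", "Marked Circle", "Wood Camp", "Mountain Stash",
    "Gate to Factory",
]

class SequentialSpawner:
    def __init__(self, locations, group_min=5, group_max=6):
        self.locations = locations
        self.group_min, self.group_max = group_min, group_max
        self.index = 0   # next location in the fixed order
        self.alive = 0   # scavs remaining in the current patrol

    def spawn_next_patrol(self):
        if self.index >= len(self.locations):
            return None  # all patrols exhausted for this raid
        location = self.locations[self.index]
        self.index += 1
        self.alive = random.randint(self.group_min, self.group_max)
        return location, self.alive

    def on_scav_killed(self):
        self.alive -= 1
        if self.alive == 0:          # patrol wiped -> trigger the next one
            return self.spawn_next_patrol()
        return None

spawner = SequentialSpawner(WOODS_SPAWN_ORDER)
loc, count = spawner.spawn_next_patrol()
print(loc, count)  # first patrol: Wood Camp with 5-6 scavs
```

With 10 locations and 5-6 scavs per patrol, the total potential is the 50-60 scavs per raid described above, while only one patrol's worth is ever alive at once.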
By providing ordered static patrols on each map, players no longer find themselves wandering around the map checking an infinite number of angles, trying to decide where to go and what to do. It also disrupts the rushing of high-tier loot spawns by incentivising players to engage with scavs, who behave as a group in larger numbers than we have now. Texturing a map through ordered static scav spawns raises two problems with scavs: the practical and market value of scav loot, and the RNG introduced by the high-pen/high-damage ammo that scavs can spawn with. Currently, scav loot tables are unpredictable.
Sometimes when you kill a scav you get absurdly high-value items. For example, you can kill a scav now and get a FAL (50k) + keycard (100k) + rare key (100k-2mil) + T4 armor (50-100k) + Pilgrim (30-40k): a potential profit of 330k-2.3m roubles. On the other hand, it is common to find scavs with a Mosin and a scav vest, or a pistol + vest + t-bag: pointless items that nobody cares about. One third of the time, maybe less, you get a reasonable loadout from a scav, maybe an AK or shotgun with a decent rig and an MBSS.
These “reasonable loadouts” are extremely useful to new players and can provide a little buffer to the unforgiving experience of EFT for new and casual players. Also, if you are a hardcore player looking for moderate gains you can go wipe one patrol and walk out with 4-5 shotguns and AKs, providing a reasonable 120-150k profit. Chances are the casuals will actually use the AKs and shotguns, and the hardcore players will mod the AKs and sell the shotguns, removing currency from the economy by buying mods, which is useful for balancing the economy. It’s a win-win for casual, new, and hardcore players.
This brings us to the second issue with scavs: the RNG created by the high-pen ammo they spawn with fairly often. This needs to stop, because it means that neither casuals nor hardcore players want to run high-tier gear at all. They know that one bush wookie or terminator A.I. with a Mosin, VEPR Hunter, or FAL can delete them in one shot. This is fine if it’s a PMC with a Mosin, but nobody should be getting one-tapped through gear by a scav, PERIOD. I don’t care if it’s more realistic. This game is not actually about realism; it is about immersion, which is a game-design mechanic, not a realism mechanic. If you haven’t realized this about EFT yet, simply refer to the clips of Nikita referring to the secure containers on different podcasts and videos as “magic.” But I digress… The game is about balance, fun, and intensity, not a war simulator.
Get these high-tier bullets OUT of the scav loot tables completely. It does not make the game more fun for anyone. We want players to bring in high-tier gear because it allows players to experience these beautifully crafted items, which is really the point of the game - the experience you get when you get to USE nice gear that you worked hard to earn. Players would be much less likely to hoard currency and destabilize the economy, and much more likely to bring in decent loadouts, if this one issue of high-pen ammo on scavs was resolved. We have players with 50-100 mil inventory values who STILL run a T3 chest and an SSh-68 every raid because they know that running anything else is too risky.
In place of these VEPR Hunters, FALs, and Mosins, I propose a number of drastic changes to the loot tables themselves, which could easily be made through "ninja tweaks." Get rid of all high-value loot from scavs and replace it with MORE moderately valued items. New players don't need Labs keycards; they don't need vases, FALs, and other high-value, low-practicality items. New and casual players need fun and useful items to get them through their raids: tier 2-3 armor so they don't get one-tapped by any and every scav and pistol player with a PM or shotgun, and decent meds, especially ones that stop bleeds and limping. So here is my new proposed loot table for all scavs:
Meds provide a much needed buffer for new and casual players. Primary weapons should not be overly useless, such as the Toz. Primary weapons should not be overpowered; I've mentioned these already. Instead these primary weapons should be AK-based or shotgun-based, and they should be running PS/PP ammo or buckshot. A pack of 5-6 scavs with AKs and shotguns makes for a fun fight. It would be cool to see them switching to sidearms as they use up their ammo, and adding a pistol to every scav on top of their primary injects more mid/low-tier USEFUL gear into the game so that a new player can get two raids out of killing one scav. This means that if two new players working together manage to clear two patrols of 5-6 scavs in one raid and make it out, then they are each set for at least ten failed raids.
And trust me, this is a really important metric for understanding the new player experience. Every time a new player dies they are asking themselves, "How many more failed raids do I have left before I quit this game?" This stuff doesn't seem important to most of us who have been playing for years, or to those hardcore players who play eight hours a day, but it makes a huge difference for new player retention. Guns make sense to new players, especially fully automatic AKs and MP-153s. Keycards, San keys, car batteries, and bitcoins confuse new players, and hardcore players would rather run to their static spawns than bother farming scavs for a chance at a rare item. Once these loot tables change, combined with ordered-sequential scav spawning, players will stay in raid longer, experience more action, and see the whole map in a way that makes sense and has some predictability. This game already has plenty of RNG, plenty of risk, plenty of freedom and intensity. It is time to rein it in and bring logic to the game. Create a textured, hand-crafted experience that showcases the artistic value of this game.
These changes that I have suggested don't even require a patch. We know that these metrics can be changed very quickly and easily, and they would drastically change the gameplay experience for the better. Focusing resources through these uses of scavs on each map conditions players to think about their raids in a linear way, which goes against the survival genre but distinguishes EFT within it. Providing a more "on the rails" survival "theme-park" experience for each map also conditions the playerbase for what is to come for EFT: the storyline. You don't want players to go from total freedom to suddenly being hit with a structured storyline. You want to ease them into it, and what I propose here does just that and much more.
submitted by VirindiPuppetDT to EscapefromTarkov

Quant Network: Token valuation dynamics and fundamentals

This post intends to illustrate the dynamics and fundamentals related to the mechanics and use of the Quant Network Utility Token (QNT), in order to provide the community with greater clarity around what holding the token actually means.
This is a follow-up on two articles David W previously wrote about Quant Network’s prospects and potential, which you can find here:
For holders not intending to use Overledger for business reasons, the primary goal of holding the QNT token is to benefit from price appreciation. Some are happy to believe that speculation will take the QNT price to much higher levels if and when large-scale adoption/implementation news comes out, whilst others may actually prefer to assess the token’s utility and analyse how it would react to various scenarios to justify a price increase based on fundamentals. The latter is precisely what I aim to look into in this article.
On that note, I have noticed that many wish to see institutional investors getting involved in the crypto space for their purchasing power, but the one thing they would bring that is most needed, in my opinion, is fundamental analysis and valuation expectations based on facts. Indeed, equity investors can probably access 20 or 30 reports that are 15 pages long and updated on a quarterly basis about any blue chip stock they are invested in, but how many such (professional) analyst reports can you consult for your favorite crypto coins? Let me have a guess: none. This is unfortunate, and it is a further reason to look into the situation in more detail.
To be clear, this article is not about providing figures on the expected valuation of the token, but rather about providing the community with a deeper analysis to better understand its meaning and valuation context. This includes going through the (vast) differences between a Utility Token and a Company Share, since I understand it is still blurry in some people's minds. I will incorporate my thoughts and perspective on these matters, which should not be regarded as a single source of truth but rather as an attempt to "dig deeper".
In order to share these thoughts with you in the most pertinent manner, I have actually entirely modelled the Quant Treasury function and analysed how the QNT token would react to various scenarios based on a number of different factors. That does not mean there is any universal truth to be told, but it did help in clarifying how things work (with my understanding of the current ruleset at least, which may also evolve over time). This is an important safety net: if the intensity of speculation in crypto markets was to go lower from here, what would happen to the token price? How would Quant Treasury help support it? If the market can feel comfortable with such situation and the underlying demand for the token, then it can feel comfortable to take it higher based on future growth expectations — and that’s how it should be.
Finally, to help shed light on different areas, I must confess that I will have to go through some technicalities on how this all works and what a Utility Token actually is. That is the price to pay to gain that further, necessary knowledge and be in a position to assess the situation more thoroughly — but I will make it as readable as I possibly can, so… if you are ready, let's start!

A Utility Token vs. a Company Share: what is the difference?

It is probably fair to say that many people involved in the crypto space are unfamiliar with certain key financial terms or concepts, simply because finance is not necessarily everyone’s background (and that is absolutely fine!). In addition, Digital Assets bring some very novel concepts, which means that everyone has to adapt in any case.
Therefore, I suggest we start with a comparison of the characteristics underpinning the QNT Utility Token and a Quant Network Company Share (as you may know, the Company Shares are currently privately held by the Quant Network founders). I believe it is important to look at this comparison for two reasons:
  1. Most people are familiar with regular Company Shares because they have been traded for decades, and it is often asked how Utility Tokens compare.
  2. Quant Network have announced a plan to raise capital to grow their business further (in the September 2019 Forbes article which you can find here). Therefore, regardless of whether the Share Offering is made public or private, I presume the community will want to better understand how things compare and the different dynamics behind each instrument.
So where does the QNT Utility Token sit in Quant Network company and how does it compare to a Quant Network Company Share? This is how it looks:
https://preview.redd.it/zgidz8ed74y31.png?width=1698&format=png&auto=webp&s=54acd2def0713b67ac7c41dae6c9ab225e5639fa
What is on the right hand side of a balance sheet is the money a company has, and what is on the left hand side is how it uses it. Broadly speaking, the money the company has may come from the owners (Equity) or from the creditors (Debt). If I were to apply these concepts to an individual (you!), “Equity” is your net worth, “Debt” is your mortgage and other debt, and “Assets” is your house, car, savings, investments, crypto, etc.
As you can see, a Company Share and a Utility Token are found in different parts of the balance sheet — and that, in itself, is a major difference! They indeed serve two very different purposes:
  • Company Shares: they represent a share of a company’s ownership, meaning that you actually own [X]% of the company ([X]% = Number of shares you possess / Total number of shares) and hence [X]% of the company’s assets on the left hand side of the balance sheet.
  • Utility Tokens: they are keys to access a given platform (in our case, Quant Network’s Operating System: Overledger) and they can serve multiple purposes as defined by their Utility Document (in QNT’s case, the latest V0.3 version can be found here).
As a consequence, as a Company Shareholder, you are entitled to receive part or all of the profits generated by the company (as the case may be) and you can also take part in the management decisions (indeed, with 0.00000001% of Apple shares, you have the corresponding right to vote to kick the CEO out if you want to!).
On the other hand, as a Utility Token holder, you have no such rights related to the company’s profits or management, BUT any usage of the platform has to go through the token you hold — and that has novel, interesting facets.

A Utility Token vs. a Company Share: what happens in practice?

Before we dig further, let’s now remind ourselves of the economic utilities of the QNT token (i.e. in addition to signing and encrypting transactions):
  1. Licences: a licence is mandatory for anyone who wishes to develop on the Overledger platform. Enterprises and Developers pay Quant Network in fiat money and Quant Treasury subsequently sets aside QNT tokens for the same amount (a diagram on how market purchases are performed can be found on the Overledger Treasury page here). The tokens are locked for 12 months, and the current understanding is that the amount of tokens locked is readjusted at each renewal date to the prevailing market price of QNT at the time (this information is not part of the Utility Token document as of now, but it was given in a previous Telegram AMA so I will assume it is correct pending further developments).
  2. Usage: this relates to the amount of Overledger read and write activity performed by clients on an ongoing basis, and also to the transfer of Digital Assets from one chain to another, and it follows a similar principle: fiat money is received by Quant Network and subsequently converted into QNT tokens (these tokens are not locked, however).
  3. Gateways: information about Gateways has been released through the Overledger Network initiative (see dedicated website here), and we now know that the annual cost for running a Gateway will be 500 QNT whilst Gateway holders will receive a percentage of transaction fees going through their setup.
  4. Minimum holding amounts: the team has stated that there will be a minimum QNT holding amount put in place for every participant of the Overledger ecosystem, although the details have not been released yet.
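The licence mechanic described above can be sketched in a few lines. This is a hypothetical illustration only: the fee amount, prices, and function name are my own assumptions, not figures published by Quant Network.

```python
# Hypothetical sketch of the licence-locking mechanic: a fiat fee is
# converted into QNT at the prevailing market price and locked, and the
# locked amount is readjusted at each 12-month renewal. All numbers are
# illustrative assumptions.

def qnt_locked_for_licence(fiat_fee: float, qnt_price: float) -> float:
    """Tokens Quant Treasury sets aside for a licence paid in fiat."""
    return fiat_fee / qnt_price

# A licence fee of 1,000 paid while QNT trades at 5.0:
initial_lock = qnt_locked_for_licence(1000.0, 5.0)    # 200.0 QNT locked

# At renewal, if the QNT price has doubled, fewer tokens stay locked
# for the same fiat-denominated fee:
renewal_lock = qnt_locked_for_licence(1000.0, 10.0)   # 100.0 QNT locked
```

Note the consequence: the fiat value locked stays constant, while the token count locked moves inversely with the QNT price.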
That being said, it now becomes interesting to illustrate with indicative figures what actually happens as Licences, Usage and Gateways are paid for and Quant Network company operates. The following diagram may help in this respect:
Arbitrary figures of my own (no currency, no unit), based on an indicative 20% Net Income Ratio and a 40% dividend payout ratio
We have now two different perspectives:
  • On the right hand side, you see the simplified Profit & Loss account (“P&L”), which incorporates Total Revenues, from which costs and taxes are deducted to give a Net Income for the company. A share of this Net Income may be distributed to Shareholders in the form of a Dividend, whilst the remainder is accounted for as retained profits and goes back to the balance sheet as Equity to fund further growth, for instance. Importantly, the Dividend (if any) is usually a portion of the Net Income so, using an indicative 40% dividend payout policy, shareholders here receive for a given year 80 out of total company revenues of 1,000.
  • On the left hand side, you see the QNT requirements arising from the Overledger-related business activity which equal 700 here. Note that this is only a portion of the Total Revenues (1,000) you can see on the right hand side, as the team generates income from other sources as well (e.g. consultancy fees) — but I assume Overledger will represent the bulk of it since it is Quant Network’s flagship product and focus. In this case, the equivalent fiat amount of QNT tokens represents 700 (i.e. 100% of Overledger-related revenues) out of the company’s Total Revenues of 1,000. It is to be noted that excess reserves of QNT may be sold and generate additional revenues for the company, which would be outside of the Overledger Revenues mentioned above (i.e. they would fall in the “Other Revenues” category).
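The arithmetic behind the diagram can be reproduced directly. The 20% net-income ratio, the 40% dividend assumption, and the 700 of Overledger-related revenues are the article's illustrative figures, not actual Quant Network financials.

```python
# Reproducing the indicative figures from the diagram (no currency, no
# unit). All ratios are the article's illustrative assumptions.

total_revenues = 1_000
net_income_ratio = 0.20        # share of revenues left after costs and taxes
dividend_payout = 0.40         # share of net income distributed to shareholders
overledger_revenues = 700      # portion of revenues converted into QNT

net_income = total_revenues * net_income_ratio    # 200.0
dividend = net_income * dividend_payout           # 80.0 -> "80 out of 1,000"
qnt_requirement = overledger_revenues             # 700 (100% of Overledger revenues)
```

This makes the summary concrete: shareholders take a view on the 200 of profits (receiving 80 here), while token holders take a view on the 700 of Overledger-related revenues.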
A way to summarise the situation from a very high level is: as a Company Shareholder you take a view on the company’s total profits whereas as a Utility Token holder you take a view on the company’s revenues (albeit Overledger-related).
It is however too early to reach any conclusion, so we now need to dig one level deeper again.

More considerations around Company Shares

As we discussed, with a Company Share, you possess a fraction of the company’s ownership and hence you have access to profits (and losses!). So how do typical Net Income results look in the technology industry? What sort of Dividend is usually paid? What sort of market valuations are subsequently achieved?
Let’s find out:
https://preview.redd.it/eua9sqlt74y31.png?width=2904&format=png&auto=webp&s=3500669942abf62a0ea1c983ab3cea40552c40d1
As you can see, the typical Net Income Ratio varies between around 10% and 20% in the technology/software industry (using the above illustrated peer group). The ratio illustrates the proportion of Net Income extracted from Revenues.
In addition, money is returned to Company Shareholders in the form of a Dividend (i.e. a portion of the Net Income) and in the form of Share repurchases (whereby the company uses its excess cash position to buy back shares from Shareholders and hence diminish the number of Shares available). A company may however prefer to not redistribute any of the profits, and retain them instead to fund further business growth — Alphabet (Google) is a good example in this respect.
Interestingly, as you can see on the far right of the table, the market capitalisations of these companies reflect high multiples of their Net Income as investors expect the companies to prosper in the future and generate larger profits. If you wished to explore these ideas further, I recommend also looking into the Return on Equity ratio which takes into account the amount of resources (i.e. Capital/Equity) put to work to generate the companies’ profits.
It is also to be noted that the number of Company Shares outstanding may vary over time. Indeed, aside from Share repurchases that diminish the number of Shares available to the market, additional Shares may be issued to raise additional funds from the market hence diluting the ownership of existing Shareholders.
Finally, (regular) Company Shares are structured in the same way across companies and industries, which brings a key benefit of having them easily comparable/benchmarkable against one another for investors. That is not the case for Utility Tokens, but they come with the benefit of having a lot more flexible use cases.

More considerations around the QNT token

As discussed, the Utility Token model is quite novel and each token has unique functions designed for the system it is associated with. That does not make value assessment easy, since all Utility Tokens are different, and this is a further reason to have a detailed look into the QNT case.
https://preview.redd.it/b0xe0ogw74y31.png?width=1512&format=png&auto=webp&s=cece522cd7919125e199b012af41850df6d9e9fd
As a start, all assets that are used in a speculative way embed two components into their price:
A) one that represents what the asset is worth today, and
B) one that represents what it may be worth in the future.
Depending on whether the future looks bright or not, a price premium or a price discount may be attached to the asset price.
This is similar to what we just saw with Company Shares valuation multiples, and it is valid across markets. For instance, Microsoft generates around USD 21bn in annual Net Income these days, but the cost of acquiring it entirely is USD 1,094bn (!). This speculative effect is particularly visible in the crypto sector since valuation levels are usually high whilst usage/adoption levels are usually low for now.
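The Microsoft example above can be restated as a valuation multiple. The figures are the ones quoted in the article (late-2019, in USD bn); the multiple is just their ratio.

```python
# The speculative premium, expressed as a price-to-earnings multiple
# using the article's Microsoft figures (USD bn, late 2019).

annual_net_income = 21     # Microsoft's annual Net Income
market_cap = 1_094         # cost of acquiring the company entirely

price_to_earnings = market_cap / annual_net_income
print(round(price_to_earnings, 1))  # 52.1 -> buyers pay ~52 years of
                                    # current profits, pricing in growth
```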
So what about QNT? As mentioned, the QNT Utility model has novel, interesting facets. Since QNT is required to access and use the Overledger system, it is important to appreciate that Quant Network company has three means of action regarding the QNT token:
  1. MANAGING their QNT reserves on an ongoing basis (i.e. buying or selling tokens is not always automatic, they can allocate tokens from their own reserves depending on their liquidity position at any given time),
  2. BUYING/RECEIVING QNT from the market/clients on the back of business activity, and
  3. SELLING QNT when they deem their reserves sufficient and/or wish to sell tokens to cover for operational costs.
Broadly speaking, the above actions will vary depending on business performance, the QNT token price and the Quant Network company’s liquidity position.
We also have to appreciate what the QNT distribution will look like; it can be broken down as follows:
https://preview.redd.it/f20h7hvz74y31.png?width=1106&format=png&auto=webp&s=f2f5b63272f5ed6e3f977ce08d7bae043851edd1
A) QNT tokens held by the QNT Community
B) QNT tokens held by Quant Network that are locked (i.e. those related to Licences)
C) QNT tokens held by Quant Network that are unlocked (i.e. those related to other usage, such as consumption fees and Gateways)
D) the minimum QNT amount held by all users of the platform (more information on this front soon)
So now that the situation is set, how would we assess Quant Network’s business activity effect on the QNT token?
STEP 1: We would need to define the range of minimum/maximum amounts of QNT which Quant Network would want to keep as liquid reserves (i.e. unlocked) on an ongoing basis. This affects key variables such as the proportion of market purchases vs. the use of their own reserves, and the amount of QNT sold back to the market. Also, interestingly, if Quant Network never wanted to keep less than, for instance, 1 million QNT tokens as liquid reserves, these 1 million tokens would have a similar effect on the market as the locked tokens because they would never be sold.
STEP 2: We would need to define the amount of revenues that are related to QNT. As we know, Overledger Licences, Usage and Gateways generate revenues converted into QNT (or in QNT directly). So the correlation is strong between revenues and QNT needs. Interestingly, the cost of a licence is probably relatively low today in order to facilitate adoption and testing, but it will surely increase over time. The same goes for usage fees, especially as we move from testing/pilot phases to mass implementation. The number of clients will also increase. The Community version of Overledger is also set to officially launch next year. More information on revenue potential can be found later in this article.
STEP 3: We would need to define an evolution of the QNT token price over time and see how things develop with regards to Quant Network’s net purchase/sale of tokens every month (i.e. tokens required - tokens sold = net purchased/sold tokens).
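STEP 3's monthly net-flow calculation can be sketched as follows. Deriving "tokens required" as revenues divided by price, and all the numbers used, are my own illustrative assumptions for this sketch, not the article's model.

```python
# A minimal sketch of STEP 3: net tokens purchased (+) or sold (-) by
# Quant Treasury in a given month. The derivation of "tokens required"
# from QNT-related revenues and the QNT price is an illustrative
# assumption.

def monthly_net_flow(qnt_revenues: float, qnt_price: float,
                     tokens_sold: float) -> float:
    """tokens required - tokens sold = net purchased/sold tokens."""
    tokens_required = qnt_revenues / qnt_price
    return tokens_required - tokens_sold

# Revenues of 50,000 at a QNT price of 10 require 5,000 tokens; if
# Treasury sells 1,000 from reserves, the net market purchase is 4,000:
print(monthly_net_flow(50_000, 10.0, 1_000))   # 4000.0

# If reserves are ample and sales exceed requirements, the flow turns
# negative (a net sale):
print(monthly_net_flow(10_000, 20.0, 800))     # -300.0
```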
Once assumptions are made, what do we observe?
In an undistorted environment, there is a positive correlation between Quant Network’s QNT-related revenues and the market capitalisation they occupy (i.e. the Quant Network share of the token distribution multiplied by the QNT price). However, this correlation can get heavily twisted as the speculative market prices a premium to the QNT price (i.e. anticipating higher revenues). As we will see, a persistent discount is not really possible as Quant Treasury would mechanically have to step in with large market purchases, which would provide strong support to the QNT price.
In addition, volatility is to be added to the equation since QNT volatility is likely to be (much) higher than that of revenues which can create important year-on-year disparities. For instance, Quant Treasury may lock a lot of tokens at a low price one year, and be well in excess of required tokens the next year if the QNT token price has significantly increased (and vice versa). This is not an issue per se, but this would impact the amount of tokens bought/sold on an ongoing basis by Quant Treasury as reserves inflate/deflate.
If we put aside the distortions created by speculation on the QNT price, and the subsequent impact on the excess/deficiency of Quant Network token reserves (whose level is also pro-actively managed by the company, as previously discussed), the economic system works as follows:
High QNT price vs. Revenue levels: The value of reserves is inflated, fewer tokens need to be bought for the level of revenues generated, Quant Treasury provides low support to the QNT price, its share of the token distribution diminishes.
Low QNT price vs. Revenue levels: Reserves run out, a higher number of tokens needs to be bought for the level of revenues generated, Quant Treasury provides higher support to the QNT price, its share of the token distribution increases.
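The two scenarios above reduce to one relationship: for a fixed fiat value of Overledger activity, the number of tokens Treasury must source is inversely proportional to the QNT price. A toy comparison with illustrative numbers:

```python
# Toy illustration of the high-price vs. low-price scenarios. The
# revenue figure and the two prices are arbitrary assumptions.

qnt_revenues = 100_000  # fiat value of QNT-related business activity

# tokens needed at a "high" and a "low" QNT price
tokens_needed = {price: qnt_revenues / price for price in (50.0, 5.0)}

print(tokens_needed[50.0])  # 2000.0  -> weak buy pressure, reserves suffice
print(tokens_needed[5.0])   # 20000.0 -> heavy market purchases support the price
```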
Summary table:
https://preview.redd.it/q7wgzpv384y31.png?width=2312&format=png&auto=webp&s=d8c0480cb34caf2e59615ec21ea220d81d79b153
The key here is that, whatever speculation on future revenue levels does to the token in the first place, if the QNT price fell to a level that did not reflect the prevailing Overledger revenue levels, then Quant Treasury would require a larger amount of tokens to cover the business needs. That would mean the depletion of their reserves, larger purchases from the market, and strong support for the QNT price from there. This is the safety net we want to see, coming from usage! In other words, if the QNT price went very high very quickly, Quant Treasury may not be seen buying many tokens since their reserves would be inflated, BUT that fallback mechanism, purely based on usage, would be there to safeguard QNT holders from the QNT price falling below a certain level.
I would assume this makes sense for most, and you might now wonder why I have highlighted the bottom part about the token distribution in red. That is because there is an ongoing battle between the QNT community and Quant Treasury — and this is very interesting.
The ecosystem will show how big a share the community is willing to let Quant Network represent. The community actually sets the price for the purchases, and the token distribution fluctuates depending on the metrics we discussed. An equilibrium will form based on the confidence the market has in Quant Network’s future revenue generation. Moreover, the QNT community could perceive the token as a Store of Value and be happy to hold 80-90% of all tokens, for instance, or it could perceive QNT as more dynamic or risky and be happy to represent only 60-70% of the distribution. Needless to say that, considering my previous articles on the potential of Overledger, I think we will tend more towards the former scenario. Indeed, if you wished to store wealth with a technology-agnostic, future-proof, globally adopted, revenue-providing (through Gateways) Network of Networks through which most of the digitalised value flows — wouldn’t you see QNT as an appealing value proposition?
In a nutshell, it all comes down to the Overledger revenue levels and the QNT holders’ resistance to buy pressure from Quant Treasury. Therefore, if you are confident in the Overledger revenue generation and wish to see the QNT token price go up, more than ever, do not sell your tokens!
What about the locked tokens? There will always be a certain amount of tokens that are entirely taken out of circulation, but Quant Network company will always keep additional unlocked tokens on top of that (those they receive and manage as buffer) and that means that locked tokens will always be a subset of what Quant Network possesses. I do not know whether fees will primarily be concentrated on the licencing side vs. the usage side, but if that were to be the case then it would be even better as a higher amount of tokens would be taken out of circulation for good.
Finally, as long as the company operates, the revenues will always represent a certain amount of money whereas this is not the case for profits which may not appear before years (e.g. during the first years, during an economic/business downturn, etc.). As an illustration, a company like Uber has seen vast increases in revenues since it launched but never made any profit! Therefore, the demand for the QNT token benefits from good resilience from that perspective.
Quant Network vs. QNT community — What proportion of the QNT distribution will each represent?

How much revenues can Overledger generate?

I suggest we start with the basis of what the Quant Network business is about: connecting networks together, building new-generation hyper-decentralised apps on top (called “mApps”), and creating network effects.
Network effects are best defined by Metcalfe’s law which states: “the effect of a telecommunications network is proportional to the square of the number of connected users of the system” (Source: Wikipedia). This is illustrated by the picture below, which demonstrates the increasing number of possible connections for each new user added to the network. This was also recently discussed in a YouTube podcast by QNT community members “Luke” and “Ghost of St. Miklos” which you can watch here.
Source: applicoinc.com
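The growth Metcalfe's law describes can be made concrete: a network of n users allows n·(n−1)/2 distinct pairwise connections, so the count grows roughly with the square of n.

```python
# Metcalfe's law as quoted above: possible pairwise connections in a
# network of n users.

def possible_connections(n: int) -> int:
    return n * (n - 1) // 2

print(possible_connections(4))    # 6
print(possible_connections(100))  # 4950 -> 25x the users, ~825x the connections
```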
This means that, as Overledger continues to connect more and more DLTs of all types between themselves and also with legacy systems, the number of users (humans or machines) connected to this Network of Networks will grow substantially — and the number of possible connections between participants will in turn grow exponentially. This will increase the value of the network, and hence the level of fees associated with getting access to it. This forms the basis of expected, future revenue generation and especially in a context where Overledger remains unique as of today and embraced by many of the largest institutions in the world (see the detailed summary on the matter from community member “Seq” here).
On top of this network, multi-chain hyper-decentralised applications (‘mApps’) can be built — which are an upgrade to existing dApps that use only one chain at a time and hence only benefit from the user base and functionalities of the given chain. Overledger mApps can leverage on the users and abilities of all connected chains at the same time, horizontal scaling, the ability to write/move code in any language across chains as required, write smart contracts on blockchains that do not support them (e.g. Bitcoin), and provide easier connection to other systems. dApps have barely had any success so far, as discussed in my first article, but mApps could provide the market with the necessary tools to build applications that can complement or rival what can be found on the Apple or Google Play store.
Also, the flexibility of Overledger enables Quant Network to target a large number of industries and to connect them all together. A sample of use cases can be found in the following illustration:
https://preview.redd.it/th8edz5b84y31.png?width=2664&format=png&auto=webp&s=105dd4546f8f9ab2c66d1a5a8e9f669cef0e0614
It is to be noted that one of the use cases, namely the tokenisation of the entire world’s assets, represents a market worth hundreds of trillions of USD and that is not even including the huge amount of illiquid assets not currently traded on traditional Capital Markets which could benefit from the tokenisation process. More information on the topic can be found in my previous article fully focused on the potential of Overledger to capture value from the structural shift in the world’s assets and machine-related data/value transfers.
Finally, we can look at what well established companies with a similar technology profile have been able to achieve. Overledger is an Operating System for DLTs and legacy systems on top of which applications can be built. The comparison to Microsoft Windows and the suite of Microsoft Software running on top (e.g. Microsoft Office) is an obvious one from that perspective to gauge the longer term potential.
As you can see below, Microsoft’s flagship software products, such as Windows and Office, each generate tens of billions of USD of revenues every year:
Source: Geekwire
We can also look at Oracle, the second largest Enterprise software company in the world:
Source: Statista
We can finally look at what the Apple store and the Google Play store generate, since the Quant Network “mApp store” for the community side of Overledger will look to replicate a similar business model with hyper-decentralised applications:
Source: Worldwide total revenue by app store, 2018 ($bn)
The above means total revenues of around USD 70bn in 2018 for the Apple store and Google Play store combined, and the market is getting bigger year-on-year! Also, again, these (indicative!) reference points for Overledger come in the context of the Community version of the system only, since the Enterprise version represents a separate set of verticals more comparable to the likes of Microsoft and Oracle which we just looked at.

Conclusion

I hope this article helped shed further light on the QNT token and how the various market and business parameters will influence its behavior over time, as the Quant Network business is expected to grow exponentially in the coming years.
In the recent Forbes interview, Quant Network’s CEO (Gilbert Verdian) stated: “Our potential to grow is uncapped as we change and transform industries by creating a secure layer between them at speed. Our vision is to build a mass version of what I call an internet of trust, where value can be securely transferred between global partners not relying on defunct internet security but rather that of blockchain.”
This is highly encouraging with regards to business prospects and also in comparison to what other companies have been able to achieve since the Web as we know it today emerged (e.g. Microsoft, Google, Apple, etc.). The Internet is now entering a new phase, with DLT technology at its core, and Overledger is set to be at the forefront of this new paradigm which will surely offer a vast array of new opportunities across sectors.
I believe it is an exciting time for all of us to be part of the journey, as long as any financial commitment is made with a good sense of responsibility and understanding of what success comes down to. “Crypto” is still immature in many respects, and the emergence of a dedicated regulatory framework, combined with the expected gradual, selective entrance of institutional money managers, will hopefully help shed further light and protect retail token holders from the misunderstandings, misinformation and misconduct which too many have suffered from in recent years.
Thanks for your time and interest.
Appendix:
First article: “The reasons why Quant Network (QNT) will rise to the Top of the crypto sphere in the coming months”
Second article: “The potential of Quant Network’s technology to capture value from the structural shift in the World’s assets and machine-related data/value transfers”
October 2019 City AM interview of Gilbert Verdian (CEO): Click here
October 2019 Blockchain Brad interview of Gilbert Verdian (CEO): Click here
July 2019 Blockchain Brad interview of Gilbert Verdian (CEO): Click here
February 2019 Blockchain Brad interview of Gilbert Verdian (CEO): Click here
----
About the original author of the article:
My name is David and I spent years in the Investment Banking industry in London. I hold QNT tokens and the above views are based on my own thoughts and research only. I am not affiliated with the Quant Network team in any way. This is not investment advice, please do your own research and understand what you are buying before doing so. It is also my belief that more than 90% of all other crypto projects will fail because what matters is what is getting adopted; please do not put more money at risk than you can afford to lose.
submitted by mr_sonic to CryptoCurrency [link] [comments]

Buffett Predictions about Bitcoin

I just wanted to comment on this: Buffett making predictions and how they are panning out, especially his comment about Bitcoin. All I need to say is this.
You can't kill an idea.
Regardless of the value of a single crypto coin, the idea stays and the technology will integrate. Many companies/tokens will crash. That's evolution. Many will lose, many will gain. But it will not end. It will evolve. The idea stays.
How many companies exist today, and how many businesses have failed at the same time?
https://markets.businessinsider.com/currencies/news/warren-buffett-12-predictions-bitcoin-table-tennis-death-2019-4-1028123943?utm_content=buffered21f&utm_medium=social&utm_source=facebook.com&utm_campaign=buffer-bi&fbclid=iwar3nsyabrmk35g13ozx8hr28np6pr7amwgojynuzzyvgvs2blqk0qpetgvq#7-coca-cola-and-gillette7

Enjoy the day!
submitted by mojoflower to CryptoCurrency [link] [comments]

In light of the recent LN weaknesses discussion, this is a reminder that even when it works, being a “bank” as a hub is a requirement and is part of the LN specs.

From the LN blog:
“...there will be times when funds will happen to be moving in the same direction (e.g. more spending than receiving). In these situations, a routing node must maintain enough “buffer capital” to be able to wait until the flow of funds reverses and channels return to a more balanced state. Routing nodes that don’t contribute enough capital to handle periods of imbalance will experience channel exhaustion (when a node has no funds remaining in a channel) and routing failures.
This type of failure should happen relatively infrequently, because nodes that produce these routing failures will be routed around and eventually disconnected by other nodes. Thus, node operators have a strong incentive to provide enough buffer capital for the number and volume of inbound channels they’ve accepted (and implicitly agreed to support).”
In other words, if they fail to provide such capital, i.e. liquidity, they’ll be routed around and eventually kicked out of the network (but muh decentralization!).
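The “buffer capital” dynamic from the quoted blog post can be sketched with a toy simulation. This is purely illustrative (the function name and the amounts, in satoshis, are made up, not from the LN spec): one-way payment flow drains a routing node’s local channel balance until payments start failing.

```python
# Toy sketch of channel exhaustion on a routing node (illustrative numbers only).
# A channel's total capacity is split between the node's local balance and the
# remote side; routing a payment outward moves funds from local to remote.

def route_payments(local_balance, payments):
    """Return (completed, failed, remaining_balance). A payment fails once
    the node's local balance can no longer cover it (channel exhaustion)."""
    completed, failed = 0, 0
    for amount in payments:
        if amount <= local_balance:
            local_balance -= amount
            completed += 1
        else:
            failed += 1  # routing failure: peers will route around this node
    return completed, failed, local_balance

# A node committing only 50,000 sats of buffer capital against a burst of
# one-way flow starts failing payments halfway through:
print(route_payments(50_000, [20_000] * 4))  # (2, 2, 10000)
```

The point of the sketch: the more capital a hub stakes, the longer it survives a period of imbalance, which is exactly why the spec pushes operators toward bank-like capital requirements.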
Now, we already know two things:
  1. Lightning Network hubs are going to operate on a for-profit basis, generating revenue from routing transactions at the expense of miners.
This of course butchers the delicate Nash-equilibrium which Satoshi designed in bitcoin and renders BTC an alt-coin with a skewed incentive and value-stream model.
  2. Lightning Network hubs are money processors.
Onion routing or not, since they need to stake large amounts of BTC in what is an always-online setup and employ watch-towers, it will be easy to coerce them into KYC/AML regulations; otherwise they’ll be literally disconnected from the internet or, worse, swatted.
With net neutrality almost dead, we know how easy it will be for ISPs to disconnect such hubs at the man’s whim.
What we still don’t know is, how large do these hubs need to be? How much bitcoin do they need to own and stake in order to operate as liquidity providing hubs without being routed around then kicked-out?
Are we talking DeutscheBank and JP Morgan scale eventually? Or is it more of the Coinbase scale? Or is it Luke-jr’s basement-fridge node?
Will Carol, who is featured in the Lightning Network blog linked below, really get to operate her Lightning Network hub for much longer, Ms. Stark?
https://blog.lightning.engineering/posts/2018/05/30/routing.html
“Be a bank or GTFO.” -The Lightning Network
submitted by wisequote to btc [link] [comments]

November BTC Fork - The Facts

Update 2: THE NOVEMBER SEGWIT2X HARDFORK HAS NOW BEEN CANCELLED! :D
Update: Thank you for your appreciation on this article. I decided to publish it on Medium.  
You can find the article on this link.
 
Existing Article:
With less than a dozen days left before the SegWit2X fork, I thought I'd start gathering some facts before I start forming personal opinions and speculative conclusions. I refer to the SegWit1X chain as 1X and the SegWit2X chain as 2X for simplicity, and I have looked for very simple facts and safe assumptions. Here are the dots that I gathered:  
 
• Fork at Block 494,784. Approximate time = 16th of November - see Reference 6 for exact time.  
 
The New York Agreement: The NYA involved parties representing about 83% of the then hashing power, who all agreed to two changes: SegWit activation, and a hardfork to an increased block size of 2MB (2X) within 6 months of the former. Further details in reference 1.  
 
• It is safe to assume that miners will only mine the most profitable chain (possibly several chains in differing proportions).  
• If whales pump a single chain it will gain more value. If this happens, miners will be more inclined to mine that particular chain only. This will result in the other chain(s) potentially losing overall mining attractiveness.  
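That profit-following behaviour amounts to comparing expected revenue per unit of hashrate on each chain. A minimal sketch (the function and all numbers are hypothetical, not real market data):

```python
# Sketch: miners pick the chain with the highest expected revenue per hash.
# revenue_per_hash is proportional to price / difficulty when the block
# reward is the same on both sides of a fork (so it cancels out).

def best_chain(chains):
    """chains: dict of name -> (price_usd, difficulty).
    Returns the name of the most profitable chain to mine."""
    return max(chains, key=lambda c: chains[c][0] / chains[c][1])

# Hypothetical: 2X trades much lower, but if its difficulty collapsed even
# further, it would still be the more profitable chain to mine.
print(best_chain({"1X": (4800, 1.4e12), "2X": (1000, 0.2e12)}))  # 2X
```

This is why price (whale pumps) and difficulty interact: a pump raises the numerator, while miners leaving the other chain eventually lowers its denominator at the next adjustment.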
 
1X will continue to have a 1MB block and SegWit;  
2X will have a 2MB block and SegWit;  
Bitcoin Cash (Just for info right now) currently has an 8 MB block with NO SegWit;  
 
Current Price Status (Futures) on BitFinex: 2X/BTC = 0.17; 1X/BTC = 0.83  
 
Current Mining Status: 2X = Around 85% of blocks are signalling for 2X.  
It seems only a few mining pools, including Slush Pool, F2Pool and Kano CKPool, are not signalling SegWit2X. All Antpool-owned pools (Jihan Wu) are signalling for SegWit2X and will likely continue to do so up to the fork. It is not clear whether any other pools from the SegWit2X-signalling group will change their minds in the meantime.  
 
Lower mining power chain: Likely to be 1X. Fees are likely to be extremely high, as few miners will remain. Difficulty adjustment could take a few weeks, if not months. Until then it will be very difficult to transfer funds. [It may be better to keep BTC on an exchange before the fork, to ease liquidity cost/time if you want to sell either of the coins immediately]  
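The "weeks, if not months" claim follows directly from Bitcoin's retarget rule: difficulty only adjusts every 2016 blocks, and with a minority of the hashrate those blocks arrive proportionally slower. A back-of-the-envelope sketch (function name and the 15% share are illustrative, not a prediction):

```python
# Sketch (not consensus code): Bitcoin retargets difficulty every 2016 blocks,
# aiming for one block per 10 minutes. If a minority chain keeps only a
# fraction of the hashrate, block times stretch by 1/fraction until then.

TARGET_SPACING_MIN = 10
RETARGET_INTERVAL = 2016

def days_until_retarget(hashrate_share, blocks_since_retarget=0):
    """Days until the next difficulty adjustment if only `hashrate_share`
    of the pre-fork hashrate stays on the chain."""
    remaining = RETARGET_INTERVAL - blocks_since_retarget
    minutes_per_block = TARGET_SPACING_MIN / hashrate_share
    return remaining * minutes_per_block / (60 * 24)

print(days_until_retarget(1.0))              # 14.0 days at full hashrate
print(round(days_until_retarget(0.15), 1))   # 93.3 days with 15% of hashrate
```

This is also why the DAA discussion below matters so much: Bitcoin Cash's faster-reacting adjustment algorithm sidesteps exactly this failure mode.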
 
Double-spending: Miners (from 2X) will have an ability and incentive to double-spend on the minority chain (lower mining power chain). If you have huge mining power, you can allocate some of it to just double-spend on the minority chain. Some people will possibly lose confidence in the minority chain as a result.  
 
Replay-Protection: Neither 1X nor 2X currently have replay protection.  
 
Exchanges:
  1. Bitfinex: original chain is “BTC”, SegWit2x chain is “B2X”  
  2. BitMEX: Original chain is BTC  
  3. Bitstamp: Unknown  
  4. GDAX & Coinbase: hash power and market cap decides which chain is “BTC”  
  5. Kraken: Unknown  
  6. HitBTC: original chain is “BTC”, SegWit2x chain is “B2X”  
  7. CoinsBank: Original chain is BTC  
  8. CEX.IO: original chain is “BTC”, SegWit2x chain is “B2X”  
  9. Gemini: hash power decides which chain is “BTC”  
  10. Coinfloor: Unknown  
  11. BTCC (Updated on Twitter): BTCC will consider which of 1MB and 2MB to name as #bitcoin based on market feedback and adoption.  
Further details in reference 4.  
 
The OPINIONs section
Vinny Lingham's opinion: 2X will outcompete 1X.  
 
Enter Bitcoin Cash: A review by Ryan X. Charles who has incorporated some of Vinny Lingham's quotes, states the following:  
 
a. BCH is a fork of BTC with same PoW, but with improved Difficulty Adjustment Algorithm (DAA). BCH cannot die, but 1X and 2X could both die. If whales shift most of their holdings to BCH (or another coin), that would incentivise the miners to mine BCH (or another coin) instead of 1X and 2X. Both 1X and 2X would lose their mining power; however Core would release an emergency update to software adding DAA like BCH (or another coin). Thus, 1X would survive, and 2X (which might not get DAA) would die.  
 
b. If 2X continues to be the dominantly mined chain, 1X will be forced to launch an emergency update to their PoW with DAA. There could be fighting between the two chains, and as a result a struggle to become dominant potentially causing altcoins to flourish.  
 
My observations
BCH is upgrading their EDA (Emergency Difficulty Adjuster) on Nov 13. See website. This will lead to reduced volatility in BCH - likely making it more attractive to long-term miners.  
 
Mining profitability: It is currently almost equally profitable to mine either BTC or BCH.  
 
• What to keep an eye on before the fork, to judge for yourself where the fate of BTC is heading:  
  1. Mining signalling distribution
  2. DAA: 1X or 2X software updates to implement Difficulty Adjustment Algorithms
  3. Futures price before fork
  4. Significant whale movement
 
References:  
  1. New York Agreement  
  2. Hashing Distribution  
  3. Ryan X. Charles's opinions  
  4. Exchange listings for both chains  
  5. Interview with Vinny Lingham  
  6. 2X Split Countdown
 
Update: Thank you for your appreciation on this article. I decided to publish it on Medium.  
You can find the article on this link.
submitted by tenmillionsterling to CryptoMarkets [link] [comments]

Threadripper KVM GPU Passthru: Testers needed

TL;DR: Check update 8 at the bottom of this post for a fix if you don't care about the history of this issue.
For a while now it has been apparent that PCI GPU passthrough using VFIO-PCI and KVM on Threadripper is a bit broken.
This manifests itself in a number of ways: when starting a VM with a passthru GPU, it will either crash or run extremely slowly, without the GPU ever actually working inside the VM. Also, once a VM has been booted, the output of lspci on the host changes for the passed-through card. Finally, the output of dmesg suggests an issue bringing the GPU up from D0 to D3 power state.
An example of this lspci before and after VM start, as well as dmesg kernel buffer output is included here for the 7800GTX:
    08:00.0 VGA compatible controller: NVIDIA Corporation G70 [GeForce 7800 GTX] (rev a1) (prog-if 00 [VGA controller])

    [  121.409329] virbr0: port 1(vnet0) entered blocking state
    [  121.409331] virbr0: port 1(vnet0) entered disabled state
    [  121.409506] device vnet0 entered promiscuous mode
    [  121.409872] virbr0: port 1(vnet0) entered blocking state
    [  121.409874] virbr0: port 1(vnet0) entered listening state
    [  122.522782] vfio-pci 0000:08:00.0: enabling device (0000 -> 0003)
    [  123.613290] virbr0: port 1(vnet0) entered learning state
    [  123.795760] vfio_bar_restore: 0000:08:00.0 reset recovery - restoring bars
    ...
    [  129.534332] vfio-pci 0000:08:00.0: Refused to change power state, currently in D3

    08:00.0 VGA compatible controller [0300]: NVIDIA Corporation G70 [GeForce 7800 GTX] [10de:0091] (rev ff) (prog-if ff)
        !!! Unknown header type 7f
        Kernel driver in use: vfio-pci
Notice that lspci reports revision FF and can no longer read the header type correctly. Testing revealed that pretty much all graphics cards except Vega would exhibit this behavior, and indeed the output is very similar to the above.
Reddit user wendelltron and others suggested that the D0->D3 transition was to blame. After having gone through a brute-force exhaustive search of the BIOS, kernel and vfio-pci settings for power state transitions it is safe to assume that this is probably not the case since none of it helped.
AMD representative AMD_Robert suggested that only GPUs with EFI-compatible BIOS should be able to be used for passthru in an EFI environment, however, testing with a modern 1080GTX with EFI bios support failed in a similar way:
    42:00.0 VGA compatible controller: NVIDIA Corporation Device 1b80 (rev a1)

and then

    42:00.0 VGA compatible controller: NVIDIA Corporation Device 1b80 (rev ff) (prog-if ff)
        !!! Unknown header type 7f
Common to all the cards was that they would be unavailable in any way until the host system had been restarted. Any attempt at reading any register or configuration from the card would result in all-1 bits (or FF bytes). The bitmask used for the headers may in fact be what is causing the 7f header type (and not an actual header being read from the card). Not even physically unplugging and re-plugging the card, rescanning the PCIe bus (with /sys/bus/pci/rescan) would trigger any hotplug events or update the card info. Similarly, starting the system without the card and plugging it in would not be reflected in the PCIe bus enumeration. Some cards, once crashed, would show spurious PCIe ACS/AER errors, suggesting an issue with the PCIe controller and/or the card itself. Furthermore, the host OS would be unable to properly shut down or reboot as the kernel would hang when everything else was shut down.
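The all-FF "disconnected" signature described above is easy to check for programmatically: a config-space dump of a healthy device starts with its vendor/device IDs, while a wedged one reads back as all-1 bits. A small sketch (the helper name and sample bytes are mine, for illustration; on a real host the dump would come from /sys/bus/pci/devices/<addr>/config):

```python
# Sketch: detect the all-0xFF config space that marks a device which has
# fallen off the bus after the failed reset. Pure function over a bytes
# dump so it can be tested without hardware.

def looks_disconnected(config_space: bytes) -> bool:
    """True if every byte reads back as 0xFF (all-1 bits)."""
    return len(config_space) > 0 and all(b == 0xFF for b in config_space)

# First bytes of a healthy NVIDIA device: vendor 0x10DE, device 0x0091
healthy = bytes([0xDE, 0x10, 0x91, 0x00]) + bytes(60)
dead = bytes([0xFF] * 64)  # the failure mode described above

print(looks_disconnected(healthy), looks_disconnected(dead))  # False True
```

This also explains the bogus "header type 7f" in the lspci output: 0x7F is just the all-FF byte with the multi-function bit masked off, not a real header type read from the card.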
A complete dissection of the vfio-pci kernel module allowed further insight into the issue. Stepping through VM initialization one line at a time (yes this took a while) it became clear that the D3 power issue may be a product of the FF register issue and that the actual instruction that kills the card may have happened earlier in the process. Specifically, the function drivers/vfio/pci/vfio_pci.c:vfio_pci_ioctl, which handles requests from userspace, has entries for VFIO_DEVICE_GET_PCI_HOT_RESET_INFO and VFIO_DEVICE_PCI_HOT_RESET and the following line of code is exactly where the cards go from active to "disconnected" states:
    if (!ret)
        /* User has access, do the reset */
        ret = slot ? pci_try_reset_slot(vdev->pdev->slot) :
                     pci_try_reset_bus(vdev->pdev->bus);
Commenting out this line allows the VM to boot and the GPU driver to install. Unfortunately for the nVidia cards my testing stopped here as the driver would report the well known error 43/48 for which they should be ashamed and shunned by the community. For AMD cards a R9 270 was acquired for further testing.
The reason this line is in vfio-pci is because VMs do not like getting an already initialized GPU during boot. This is a well-known problem with a number of other solutions available. By disabling the line it is neccessary to use one of the other solutions when restarting a VM. For Windows you can disable the device in Device Manager before reboot/shutdown and re-enable it again after the restart - or use login/logoff scripts to have the OS do it automatically.
Unfortunately another issue surfaced which made it clear that the VMs could only be stopped once even though they could now be rebooted many times. Once they were shut down the cards would again go into the all FF "disconnect" state. Further dissection of vfio-pci revealed another instance where an attempt to reset the slot that the GPU is in was made: in drivers/vfio/pci/vfio_pci.c:vfio_pci_try_bus_reset
    if (needs_reset)
        ret = slot ? pci_try_reset_slot(vdev->pdev->slot) :
                     pci_try_reset_bus(vdev->pdev->bus);
When this line is instead skipped, a VM that has had its GPU properly disabled via Device Manager and has been properly shutdown is able to be re-launched or have another VM using the same GPU launched and works as expected.
I do not understand the underlying cause of the actual issue but the workaround seems to work with no issues except the annoyance of having to disable/re-enable the GPU from within the guest (like in ye olde days). Only speculation can be given to the real reason of this fault; the hot-reset info gathered by the ioctl may be wrong, but the ACS/AER errors suggest that the issue may be deeper in the system - perhaps the PCIe controller does not properly re-initialize the link after hot-reset just as it (or the kernel?) doesn't seem to detect hot-plug events properly even though acpihp supposedly should do that in this setup.
Here is a "screenshot" of Windows 10 running the Unigine Valley benchmark inside a VM with a Linux Mint host using KVM on Threadripper 1950x and an R9 270 passed through on an Asrock X399 Taichi with 1080GTX as host GPU:
https://imgur.com/a/0HggN
This is the culmination of many weeks of debugging. It would be interesting to hear whether anyone else is able to reproduce the workaround and can confirm the results. If more people can confirm this, then we are one step closer to fixing the actual issue.
If you are interested in buying me a pizza, you can do so by throwing some Bitcoin in this direction: 1KToxJns2ohhX7AMTRrNtvzZJsRtwvsppx
Also, English is not my native language so feel free to ask if something was unclear or did not make any sense.
Update 1 - 2017-12-05:
Expanded search to non-gpu cards and deeper into the system. Taking memory snapshots of pcie bus for each step and comparing to expected values. Seem to have found something that may be the root cause of the issue. Working on getting documentation and creating a test to see if this is indeed the main problem and to figure out if it is a "feature" or a bug. Not allowing myself to be optimistic yet but it looks interesting, it looks fixable at multiple levels.
Update 2 - 2017-12-07:
Getting a bit closer to the real issue. The issue seems to be that KVM performs a bus reset on the secondary side of the pcie bridge above the GPU being passed through. When this happens there is an unintended side effect that the bridge changes its state somehow. It does not return in a useful configuration as you would expect and any attempt to access the GPU below it results in errors.
Manually storing the bridge 4k configuration space before the bus reset and restoring it immediately after the bus reset seems to magically bring the bridge into the expected configuration and passthru works.
The issue could probably be fixed in firmware but I'm trying to find out what part of the configuration space is fixing the issue and causing the bridge to start working again. With that information it will be possible to write a targeted patch for this quirk.
Update 3 - 2017-12-10:
Begun further isolation of what particular registers in the config space are affected unintentionally by the secondary bus reset on the bridge. This is difficult work because the changes are seemingly invisible to the kernel, they happen only in the hardware.
So far at least registers 0x19 (secondary bus number) and 0x1a (subordinate bus number) are out of sync with the values in the config space. When a bridge is in faulty mode, writing their already existing value back to them brings the bridge back into working mode.
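For reference, registers 0x19 and 0x1a sit in the standard type-1 (bridge) configuration header. A small helper to pull them out of a config-space dump, so the "write their existing value back" experiment can be reasoned about offline (the function name and sample bytes are mine; the register offsets are from the PCI specification):

```python
# Sketch: extract the secondary and subordinate bus numbers from a PCI
# bridge's config space dump. These are the two registers observed to fall
# out of sync with the hardware after the secondary bus reset.

PCI_SECONDARY_BUS = 0x19    # first bus number behind the bridge
PCI_SUBORDINATE_BUS = 0x1A  # highest bus number behind the bridge

def bridge_bus_numbers(config_space: bytes):
    """Return (secondary, subordinate) bus numbers from a type-1 header."""
    return config_space[PCI_SECONDARY_BUS], config_space[PCI_SUBORDINATE_BUS]

# Hypothetical bridge with the GPU's bus (08) directly behind it:
cfg = bytearray(64)
cfg[PCI_SECONDARY_BUS] = 0x08
cfg[PCI_SUBORDINATE_BUS] = 0x08
print(bridge_bus_numbers(bytes(cfg)))  # (8, 8)
```

On a live system the same bytes are visible in /sys/bus/pci/devices/<bridge>/config, which is what makes the userland rewrite approach in the later updates possible at all.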
Update 4 - 2017-12-11 ("the ugly patch"):
After looking at the config space and trying to figure out what bytes to restore from before the reset and what bytes to set to something new it became clear that this would be very difficult without knowing more about the bridge.
Instead a different strategy was followed: Ask the bridge about its current config after reset and then set its current config to what it already is; byte by byte. This brings the config space and the bridge back in sync and everything, including reset/reboot/shutdown/relaunch without scripts inside the VM, now seems to work with the cards acquired for testing. Here is the ugly patch for the brave souls who want to help test it.
Please, if you already tested the workaround: revert your changes and confirm that the bug still exists before testing this new ugly patch:
In /drivers/pci/pci.c, replace the function pci_reset_secondary_bus with this alternate version that adds the ugly patch and two variables required for it to work:
    void pci_reset_secondary_bus(struct pci_dev *dev)
    {
        u16 ctrl;
        int i;
        u8 mem;

        pci_read_config_word(dev, PCI_BRIDGE_CONTROL, &ctrl);
        ctrl |= PCI_BRIDGE_CTL_BUS_RESET;
        pci_write_config_word(dev, PCI_BRIDGE_CONTROL, ctrl);

        /*
         * PCI spec v3.0 7.6.4.2 requires minimum Trst of 1ms. Double
         * this to 2ms to ensure that we meet the minimum requirement.
         */
        msleep(2);

        ctrl &= ~PCI_BRIDGE_CTL_BUS_RESET;
        pci_write_config_word(dev, PCI_BRIDGE_CONTROL, ctrl);

        // The ugly patch
        for (i = 0; i < 4096; i++) {
            pci_read_config_byte(dev, i, &mem);
            pci_write_config_byte(dev, i, mem);
        }

        /*
         * Trhfa for conventional PCI is 2^25 clock cycles.
         * Assuming a minimum 33MHz clock this results in a 1s
         * delay before we can consider subordinate devices to
         * be re-initialized. PCIe has some ways to shorten this,
         * but we don't make use of them yet.
         */
        ssleep(1);
    }
The idea is to confirm that this ugly patch works and then beautify it, have it accepted into the kernel and to also deliver technical details to AMD to have it fixed in BIOS firmware.
Update 5 - 2017-12-20:
Not dead yet!
Primarily working on communicating the issue to AMD. This is slowed by the holiday season setting in. Their feedback could potentially help make the patch a lot more acceptable and a lot less ugly.
Update 6 - 2018-01-03 ("the java hack"):
AMD has gone into some kind of ninja mode and has not provided any feedback on the issue yet.
Due to popular demand a userland fix that does not require recompiling the kernel was made. It is a small program that runs as any user with read/write access to sysfs (this small guide assumes "root"). The program monitors any PCIe device that is connected to VFIO-PCI when the program starts, if the device disconnects due to the issues described in this post then the program tries to re-connect the device by rewriting the bridge configuration.
This program pokes bytes into the PCIe bus. Run this at your own risk!
Guide on how to get the program:
If you have any PCI devices using VFIO-PCI the program will output something along the lines of this:
    -------------------------------------------
    Zen PCIe-Bridge BAConfig Recovery Tool, rev 1, 2018, HyenaCheeseHeads
    -------------------------------------------
    Wed Jan 03 21:40:30 CET 2018: Detecting VFIO-PCI devices
    Wed Jan 03 21:40:30 CET 2018: Device: /sys/devices/pci0000:40/0000:40:01.3/0000:42:00.0
    Wed Jan 03 21:40:30 CET 2018: Bridge: /sys/devices/pci0000:40/0000:40:01.3
    Wed Jan 03 21:40:30 CET 2018: Device: /sys/devices/pci0000:00/0000:00:01.3/0000:08:00.1
    Wed Jan 03 21:40:30 CET 2018: Bridge: /sys/devices/pci0000:00/0000:00:01.3
    Wed Jan 03 21:40:30 CET 2018: Device: /sys/devices/pci0000:40/0000:40:01.3/0000:42:00.1
    Wed Jan 03 21:40:30 CET 2018: Bridge: /sys/devices/pci0000:40/0000:40:01.3
    Wed Jan 03 21:40:30 CET 2018: Device: /sys/devices/pci0000:00/0000:00:01.3/0000:08:00.0
    Wed Jan 03 21:40:30 CET 2018: Bridge: /sys/devices/pci0000:00/0000:00:01.3
    Wed Jan 03 21:40:30 CET 2018: Monitoring 4 device(s)...
And upon detecting a bridge failure it will look like this:
    Wed Jan 03 21:40:40 CET 2018: Lost contact with /sys/devices/pci0000:00/0000:00:01.3/0000:08:00.1
    Wed Jan 03 21:40:40 CET 2018: Recovering 512 bytes
    Wed Jan 03 21:40:40 CET 2018: Bridge config write complete
    Wed Jan 03 21:40:40 CET 2018: Recovered bridge secondary bus
    Wed Jan 03 21:40:40 CET 2018: Re-acquired contact with /sys/devices/pci0000:00/0000:00:01.3/0000:08:00.1
This is not a perfect solution but it is a stopgap measure that should allow people who do not like compiling kernels to experiment with passthru on Threadripper until AMD reacts in some way. Please report back your experience, I'll try to update the program if there are any issues with it.
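For the curious, the detection step this update describes (enumerate devices bound to vfio-pci, remember each one's parent bridge) maps directly onto the sysfs layout. A minimal sketch in Python rather than the tool's original Java; the function names are mine, and it is parameterised on the driver directory so it can be exercised without real hardware:

```python
# Sketch of the detection step: list PCI devices bound to the vfio-pci
# driver and derive each device's parent bridge from its sysfs path.

import os

def vfio_devices(driver_dir="/sys/bus/pci/drivers/vfio-pci"):
    """Return PCI addresses (e.g. '0000:08:00.0') bound to vfio-pci.
    The driver directory contains symlinks named after the bound devices."""
    if not os.path.isdir(driver_dir):
        return []
    return sorted(e for e in os.listdir(driver_dir) if e.startswith("0000:"))

def parent_bridge(device_path):
    """In the sysfs device tree, the upstream bridge is simply the parent
    directory of the device's canonical path."""
    return os.path.dirname(device_path)

print(parent_bridge("/sys/devices/pci0000:00/0000:00:01.3/0000:08:00.0"))
# -> /sys/devices/pci0000:00/0000:00:01.3
```

The recovery half of the tool then just re-reads and re-writes the bridge's `config` file in that parent directory, mirroring the kernel-side "ugly patch" from userland.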
Update 7 - 2018-07-10 ("the real BIOS fix"):
Along with the upcoming A.G.E.S.A. update aptly named "ThreadRipperPI-SP3r2 1.0.0.6" comes a very welcome change to the on-die PCIe controller firmware. Some board vendors have already released BETA BIOS updates with it and it will be generally available fairly soon it seems.
Initial tests on a Linux 4.15.0-22 kernel now show PCIe passthru working phenomenally!
With this change it should no longer be necessary to use any of the ugly hacks from previous updates of this thread, although they will be left here for archival reasons.
Update 8 - 2018-07-25 ("Solved for everyone?"):
Most board vendors are now pushing out official (non-BETA) BIOS updates with AGESA "ThreadRipperPI-SP3r2 1.1.0.0" including the proper fix for this issue. After updating you no longer need to use any of the temporary fixes from this thread. The BIOS updates come as part of the preparations for supporting the Threadripper 2 CPUs, which are due to be released a few weeks from now.
Many boards support updating over the internet directly from BIOS, but in case you are a bit old-fashioned here are the links (please double-check that I linked you the right place before flashing):
Vendor | Board | Update
---|---|---
Asrock | X399 Taichi | Update to 2.3, then 3.1
Asrock | X399M Taichi | Update to 1.10, then 3.1
Asrock | X399 Fatal1ty Professional Gaming | Update to 2.1, then 3.1
Gigabyte | X399 AORUS Gaming 7 r1 | Update to F10
Gigabyte | X399 DESIGNARE EX r1 | Update to F10
Asus | PRIME X399-A | Possibly fixed in 0601 (TR2 support and sure fix inbound soon)
Asus | X399 RoG Zenith Extreme | Possibly fixed in 0601 (TR2 support and sure fix inbound soon)
Asus | RoG Strix X399-E Gaming | Possibly fixed in 0601 (TR2 support and sure fix inbound soon)
MSI | X399 Gaming Pro Carbon AC | Update to Beta BIOS 7B09v186 (TR2 update inbound soon)
MSI | X399 SLI Plus | Update to Beta BIOS 7B09vA35 (TR2 update inbound soon)
submitted by HyenaCheeseHeads to Amd [link] [comments]
