
A criticism of the article "Six monetarist errors: why emission won't feed inflation"

(be gentle, it's my first RI attempt, :P; I hope I can do justice to the subject; this is my layman understanding of many macro subjects, which may be flawed... I hope you can enlighten me if I have fallen short of a good RI)
Introduction
So, today a heterodox-leaning Argentinian newspaper, Ambito Financiero, published an article criticizing monetarism called "Six monetarist errors: why emission won't feed inflation". I find that it doesn't properly address monetarism, confuses it with other "economic schools" (for whatever that term is worth today), and may be misleading, so I was inspired to write a refutation and share it with all of you.
In some ways, criticizing monetarism is more of a historical discussion, given that the mainstream has changed since then. New Keynesian models are the bleeding edge, not Milton Friedman-style monetarism. The fact that these things keep being discussed is more a symptom that Argentinian political culture is kind of stuck in the 70s on economics.
Before getting to the meat of the argument, it's good to have in mind some common definitions of money supply measures (specifically MB, M1 and M2). These definitions apply to the US, but analogous measures exist for other countries.
Argentina, lacking access to credit because of its economic mismanagement and with government revenue falling because of the recession, is monetizing deficits far more than before (apparently around half of the budget is money-financed), yet we have seen some disinflation (worth mentioning that widespread price freezes have been in place for a few months). The author reasons that monetary phenomena cannot properly explain inflation, argues that other explanations are needed, and condemns monetarism. Here are the six points he makes:
1 - Is it a mechanical rule?
This way, we can ask by symmetry: if it is certain that when emission increases, inflation increases, then the reverse should happen when emission turns negative, yielding negative inflation. Nonetheless, we know this does not happen: prices increase easily but are quite rigid on the way down. So the identity between emission and inflation does not work like that; deflation almost never occurs, and the rhythm of price movements cannot remotely be controlled with the quantity of money alone. There is no mechanical relationship between one thing and the other.
First, the low-hanging fruit: deflation is not that uncommon. For those of you who live in the US or Europe this should be obvious, given the difficulties central banks have had achieving their targets, but even Argentina saw deflation during its depression 20 years ago.
Second, we have to be careful with what we mean by emission. A statement of quantity theory of money (extracted from "Money Growth and Inflation: How Long is the Long-Run?") would say:
Inflation occurs when the average level of prices increases. Individual price increases in and of themselves do not equal inflation, but an overall pattern of price increases does. The price level observed in the economy is that which leads the quantity of money supplied to equal the quantity of money demanded. The quantity of money supplied is largely controlled by the [central bank]. When the supply of money increases or decreases, the price level must adjust to equate the quantity of money demanded throughout the economy with the quantity of money supplied. The quantity of money demanded depends not only on the price level but also on the level of real income, as measured by real gross domestic product (GDP), and a variety of other factors including the level of interest rates and technological advances such as the invention of automated teller machines. Money demand is widely thought to increase roughly proportionally with the price level and with real income. That is, if prices go up by 10 percent, or if real income increases by 10 percent, empirical evidence suggests people want to hold 10 percent more money. When the money supply grows faster than the money demand associated with rising real incomes and other factors, the price level must rise to equate supply and demand. That is, inflation occurs. This situation is often referred to as too many dollars chasing too few goods. Note that this theory does not predict that any money-supply growth will lead to inflation—only that part of money supply growth that exceeds the increase in money demand associated with rising real GDP (holding the other factors constant).
So it's not mere emission, but money supply growing faster than money demand, that we should consider. So negative emission is not a necessary condition for deflation in this theory.
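To make that accounting concrete, here is a minimal illustrative sketch of the quantity equation in growth-rate form. The numbers are made up, and velocity is assumed roughly constant, which is a simplification:

```python
# Quantity equation MV = PY in growth-rate form:
# money_growth + velocity_growth ≈ inflation + real_gdp_growth.
# All figures below are invented for illustration only.

def implied_inflation(money_growth, real_gdp_growth, velocity_growth=0.0):
    """Long-run inflation implied by the quantity equation (growth-rate form)."""
    return money_growth + velocity_growth - real_gdp_growth

# M2 grows 40% while real income grows 3%: roughly 37% implied inflation.
print(implied_inflation(0.40, 0.03))
# M2 and real income both grow 10%: no implied inflationary pressure.
print(implied_inflation(0.10, 0.10))
```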
It's worth mentioning that the relationship with prices is observed for a broad measure of money (M2) and after a lag. From the same source of this excerpt one can observe in Fig. 3a the correlation between inflation and money growth for US becomes stronger the longer data is averaged. Price rigidities don't have to change this long term relationship per se.
But what about causality and Argentina? This neat paper shows regressions over two historical periods: 1976-1989 and 1991-2001. The same relationship between M2 and inflation is observed, stronger in the first, highly inflationary period and weaker in the second, more stable period. The regressions show a roughly 1-to-1 relationship in the high-inflation period, deviating a bit in the low-inflation period (yet the relationship is still there). Granger causality, as interpreted in the paper, shows prices caused money growth in the high-inflation period (arguably because spending was monetized), while the reverse was true for the more stable period.
So one can argue that there is a mechanical relationship, albeit one that is more complicated than simple QTOM theory. The relationship is also messier for low-inflation economies; it becomes more relevant the higher inflation is.
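For anyone curious what that kind of exercise looks like in practice, here is a rough sketch of an inflation-on-money-growth regression plus a Granger-causality check. The file name and the "m2"/"cpi" column names are hypothetical placeholders, not the paper's actual data:

```python
# Sketch of the kind of regression discussed above; data file and columns are hypothetical.
import pandas as pd
import statsmodels.api as sm
from statsmodels.tsa.stattools import grangercausalitytests

df = pd.read_csv("argentina_monthly.csv", parse_dates=["date"], index_col="date")
data = pd.DataFrame({
    "money_growth": df["m2"].pct_change(12),  # year-over-year M2 growth
    "inflation": df["cpi"].pct_change(12),    # year-over-year CPI inflation
}).dropna()

# OLS of inflation on money growth; a slope near 1 suggests a 1-to-1 pass-through.
ols = sm.OLS(data["inflation"], sm.add_constant(data["money_growth"])).fit()
print(ols.summary())

# Granger-causality check: does money growth help predict inflation (and vice versa)?
grangercausalitytests(data[["inflation", "money_growth"]], maxlag=12)
```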
Another point the author makes is that the liquidity trap is often ignored. I'll ignore the fact that you need specific conditions for the liquidity trap to be relevant to Argentina and address the point. Worth noting that while market monetarists (not exactly old-fashioned monetarists) prefer alternative explanations for monetary policy at very low interest rates, this phenomenon has a solid monetary explanation, as laid out by Krugman in his famous Japanese liquidity trap paper and his NYT blog (see this and this for some relevant articles). The simplified version is that while inflation may follow M2 growth, with all the qualifiers needed, central banks may find it difficult to target inflation when interest rates are low and agents are used to credible inflation targets. Central banks can change MB, not M2; in normal times that is good enough, but at those times M2 is out of control and "credibly irresponsible" policies are needed to return to normal (a more detailed explanation can be found in that paper I just linked, go for it if you are still curious).
It's not that monetary policy is no good; it's that central banks have to do very unconventional things to achieve their targets in a low interest rate environment. It's still an open problem, but given that symmetric inflation-targeting policies are becoming more popular, I'm optimistic.
2 - Does inflation have one cause or many?
In Argentina we know that the main determinant of inflation is increases in the price of the dollar. On top of that, economic concentration in key markets, utility price adjustments, fuel prices, distributive struggles, external commodity values, expectations, productive disequilibria, world interest rates, the economic cycle, seasonality, and external-sector restrictions act on it too.
Let's see a simple example: during Macri's government, from mid-2017 to 2019, emission was practically null; but when the dollar's value doubled in 2018, inflation doubled too (it went from 24% to 48% in 2018), and it rose again a year later. We see here that monetarist theory had no empirical validity.
For the first paragraph, one could try to run econometric tests for all those variables, at least from my layman perspective. But given that it doesn't pass the smell test (has any country exploited that while ignoring monetary policy? Also, I have shown there is at least some evidence for the money-price relationship above), I'll try to address what happened during Macri's government and whether monetarism (or at least some reasonable extension of it) can account for it.
For a complete description of macroeconomic policy in that period, Sturzenegger's account is a good one (even if a bit unreliable, given he was the central banker for that government and is considered to have been a failure). The short version is that central banks use bonds to manage monetary policy and absorb money; given the country's history of defaults, the Argentinian Central Bank (BCRA) uses its own peso-denominated bonds instead of treasury bonds. In that period the BCRA still financed the treasury, but the amount was reduced. It also emitted pesos to buy dollar reserves and then sterilized them, perhaps risking credibility further.
Near the end of 2017 it was evident the government had limited appetite for budget cuts; it had more or less abandoned its inflation-targeting regime, and the classic problem of fiscal dominance emerged, as shown in the classic "Some Unpleasant Monetarist Arithmetic" paper by Sargent and Wallace. Monetary policy gets less effective when the real value of bonds falls, and raising interest rates may be counterproductive in that environment. Rational expectations are needed to complement the QTOM.
So, given that Argentina gave no sign of going anywhere with reform, it was expected that money financing would increase at some point in the future; BCRA bonds were dumped in 2018 and 2019 as their value was perceived to have decreased, and so peso demand fell. It's not that the dollar's value increased and inflation followed, but rather that peso demand fell suddenly!
The IMF deal asked for MB growth to be null or almost null, but that doesn't say a lot about M2 (which is the relevant variable here). Without credible policies, peso demand keeps falling because bonds are dumped even more (see 2019 for a hilariously brutal example of that).
It's not emission per se, but rather that emission doesn't adjust properly to peso demand (which is falling). That doesn't mean increasing interest rates is enough to fix it, following the Sargent and Wallace model.
This is less a strict proof that a monetary phenomenon is involved and more a statement that the author hasn't shown any problem with that view; there are reasonable models for this situation. It doesn't look like a clear empirical failure to me yet.
3 - What are we talking about when we talk about emission?
The author mentions many money measures (M0, M1, M2) but doesn't address them meaningfully, as I tried to do above. It feels more like a rhetorical device, because there is no point here except "this stuff exists".
Also, it's worth pointing out that there are real criticisms to be made of Friedman on those grounds. He failed to forecast US inflation at some points when he switched to M1 instead of using M2, although he later walked that back. Monetarism kind of "failed" there (it also "failed" in the sense that modern central banks don't use money but interest rates as their main tool; "failed" in quotes because, despite being outdated, it was influential on modern central banking). This is often brought into these kinds of discussions as if economics hadn't moved beyond that. For an account of Friedman's thoughts on monetary policy and his failures, see this.
4 - Why do many countries print money without inflation increasing?
There is a mention of the Japanese situation in the 90s (the liquidity trap), which I have addressed above.
The author mentions that many countries "printed" like crazy during the pandemic, and he says:
Monetarism's apologists answer, when confronted with those grave empirical problems occurring in "serious countries", that the population "trusts" its monetary authorities, money demand even increasing in those places despite the emission. Curiously, though, it's an appeal to "trust", implying that the relationship between emission and inflation is not objective but subjective and cultural: an appreciation that abandons mechanicism and the basic certainty of monetarism, because evaluations and diagnostics (many times ideological, contextual or historical) intervene.
That's just a restatement of applying rational expectations to central bank operations. I don't see a problem with that. Rational expectations are not magic; they are an assessment of future earnings by economic actors. Humans may not be 100% rational, but central banking somehow works in many countries. You cannot just say that people are ideologues and leave it at that. What's your model?
Worth noting the author shills for bitcoin a bit in this section, for more cringe.
5 - Are we talking about a physical science or a social science?
Again, a vague mention of rational expectations ("populists and pro-market politicians could apply the same policies with different results because of how agents respond ideologically, and because of expectations") without handling the subject meaningfully. It criticizes universal macroeconomic rules that apply everywhere (this is often used to dismiss evidence from other countries uncritically rather than as a meaningful point).
6 - How do limits work?
The last question for monetarism allows us to concede it something: indeed, we can think of a kind of link between emission and inflation under extreme conditions. That is, if no monetary rule held at all, no government would need taxes; it could simply emit and spend all it needs without consequence. We know it's not like that: no government can print infinitely without undesirable effects.
Ok, good disclaimer, but given what he wrote before, what's the mechanism which causes money printing to be inflationary at some point? It was rejected before but now it seems that it exists. What was even the point of the article?
Now, the problem is thinking about monetarism at its extremes: sometimes we have inflation without emission, at other times emission with no inflation; we know that negative emission doesn't guarantee negative inflation, but that if emission is radically uncontrolled there will be economic effects.
As I wrote above, that's not what monetarism (even in its simpler form) says, nor a consequence of it. You can see some deviations in low-inflation environments, but that's not really Argentina's current situation.
Let's add other problems: the elasticity between money and prices is not evident. Neither are the time lags over which it works or is neutral. So the question is the limit cases, where monetarism has a point but also some difficulty explaining them: through which channels, and at which moments, the rules work and at which they don't.
I find the time-lag point to be a red herring. You can observe the relationship empirically, and not having a proper short/medium-run model doesn't invalidate the QTOM in the long run. While it may be that increasing interest rates or freezing MB is not effective, that's less a problem of the theory and more a problem of policy implementation.
Conclusion:
I find that the article doesn't truly get monetarism to begin with (see the points it makes about emission and money demand), nor how it's implemented in practice, nor does it seem aware of more modern theories that, while putting money in the background, don't necessarily invalidate it (rational-expectations ideas, and eventually New Keynesian models which address things like liquidity traps properly).
There are proper criticisms to be made of Friedman's old ideas, but he was still a relevant figure in his time, and the economics community has moved on to newer, better theories that owe some debt to his work. I feel most economic discussion about monetarism in Argentina is a strawman of mainstream economics, or an attack on Austrians, more than a set of genuine points ("monetarism" is used as a shorthand for those who think inflation is a monetary phenomenon, more than referring to Friedman and his disciples per se).
submitted by Neronoah to badeconomics

A (hopefully mathematically neutral) comparison of Lightning network fees to Bitcoin Cash on-chain fees.

A side note before I begin
For context, earlier today sherlocoin made a post on this sub asking whether Lightning Network transactions are cheaper than on-chain BCH transactions. This user also went on to complain on /bitcoin that his "real" numbers were getting downvoted.
I was initially going to respond to his post, but after I typed some of my response, I realized it is relevant to a wider Bitcoin audience and the level of analysis done warranted a new post. This wound up being the longest post I've ever written, so I hope you agree.
I've placed the TL;DR at the top and bottom for the simple reason that you need to prepare your face... because it's about to get hit with a formidable wall of text.
TL;DR: While Lightning node payments themselves cost less than on-chain BCH payments, the associated overhead currently requires a LN channel to produce 16 transactions just to break-even under ideal 1sat/byte circumstances and substantially more as the fee rate goes up.
Further, the Lightning network can provide no guarantee in its current state to maintain/reduce fees to 1sat/byte.

Let's Begin With An Ideal World
Lightning network fees themselves are indeed cheaper than Bitcoin Cash fees, but in order to get to a state where a Lightning payment can be made, you are required to open a channel, and to get to a state where those funds are spendable on-chain, you must close that channel.
On the Bitcoin network, the minimum accepted fee is 1sat/byte, so for now we'll assume that ideal scenario of 1sat/byte. We'll also assume the open and close are each sent as a simple native Segwit transaction with a weighted size of 141 bytes. Because we have to both open and close, this 141-byte fee is incurred twice. The total fee for an ideal open/close is 1.8¢
For comparison, a simple transaction on the BCH network requires 226 bytes one time. The minimum fee accepted next-block is 1sat/byte. At the time of writing an ideal BCH transaction fee costs ~ 0.11¢
This means that under idealized circumstances, you must currently make at least 16 transactions on a LN channel to break-even with fees
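If you want to play with that arithmetic yourself, here is a quick sketch using the sizes above; the coin prices are placeholder assumptions roughly matching the time of writing, so plug in current ones:

```python
# Break-even payment count for a Lightning channel vs. simple BCH on-chain payments.
# Transaction sizes follow the post; USD prices are placeholder assumptions.
import math

SATS_PER_COIN = 100_000_000
BTC_USD = 6_400   # assumption
BCH_USD = 500     # assumption

def onchain_fee_usd(size_bytes, sat_per_byte, coin_usd):
    return size_bytes * sat_per_byte * coin_usd / SATS_PER_COIN

def ln_breakeven_txs(open_rate=1, close_rate=1):
    """Payments a channel must carry so open + close cost no more per payment than a BCH tx."""
    channel_cost = (onchain_fee_usd(141, open_rate, BTC_USD) +
                    onchain_fee_usd(141, close_rate, BTC_USD))
    bch_tx_cost = onchain_fee_usd(226, 1, BCH_USD)
    return math.ceil(channel_cost / bch_tx_cost)

print(ln_breakeven_txs())             # ~16 with 1 sat/byte both ways
print(ln_breakeven_txs(open_rate=5))  # ~48-50 when the open pays ~5 sat/byte at peak
```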
Compounding Factors
Our world is not ideal, so below I've listed compounding factors, common arguments, an assessment, and whether the problem is solvable.
Problem 1: Bitcoin and Bitcoin Cash prices are asymmetrical.
Common arguments:
BTC: If Bitcoin Cash had the same price, the fees would be far higher
Yes, this is true. If Bitcoin Cash had the same market price as Bitcoin, our ideal scenario changes substantially. An open and close on Bitcoin still costs 1.8¢ while a simple Bitcoin Cash transaction now costs 1.4¢. The break-even point for a Lightning Channel is now only 2 transactions.
Is this problem solvable?
Absolutely.
Bitcoin Cash has already proposed a reduction in fees to 1sat for every 10 bytes, and that amount can be made lower by later proposals. While there is no substantial pressure to implement this now, if Bitcoin Cash had the same usage as Bitcoin currently does, it is far more likely to be implemented. If implemented at the first proposed reduction rate, under ideal circumstances, a Lightning Channel would need to produce around 13 transactions for the new break even.
But couldn't Bitcoin reduce fees similarly?
The answer there is really tricky. If you reduce on-chain fees, you reduce the incentive to use the Lightning Network, as the network becomes more hospitable to micropayments. This would likely increase the typical mempool state and decrease the Lightning channel count somewhat. The upside is that when the mempool saturates with low-fee transactions, users are re-incentivized to use the Lightning Network once the lowest fee tiers are saturated. This should, in theory, produce some level of a transaction-fee floor which is probably higher on average than 0.1 sat/byte on the BTC network.
Problem 2: This isn't an ideal world, we can't assume 1sat/byte fees
Common arguments:
BCH: If you tried to open a channel at peak fees, you could pay $50 each way
BTC: LN wasn't implemented which is why the fees are low now
Both sides have points here. It's true that if the mempool were in the same state as it was in December of 2017, a user could potentially have been incentivized to pay an open and close channel fee of up to 1000 sat/byte to be accepted in a reasonable time-frame.
With that being said, two factors have resulted in a reduced mempool size of Bitcoin: Increased Segwit and Lightning Network Usage, and an overall cooling of the market.
I'm not going to speculate as to what percentage of which is due to each factor. Instead, I'm going to simply analyze mempool statistics for the last few months where both factors are present.
Let's get an idea of typical current Bitcoin network fees by taking a quick look at Johoe's mempool statistics.
For the last few months, the bitcoin mempool has followed almost the exact same pattern. Highest usage happens between 10AM and 3PM EST with a peak around noon. Weekly, usage usually peaks on Tuesday or Wednesday with enough activity to fill blocks with at least minimum fee transactions M-F during the noted hours and usually just shy of block-filling capacity on Sat and Sun.
These observations can be additionally evidenced by transaction counts on bitinfocharts. It's also easier to visualize on bitinfocharts over a longer time-frame.
Opening a channel
Under pre-planned circumstances, you can offload channel creation to off-peak hours and maintain a 1sat/byte rate. The primary issue arises in situations where either 1) LN payments are accepted somewhere you had little prior knowledge of, or 2) you had a previous LN pathway to a known payment processor and one or more previously known intermediaries are offline or otherwise unresponsive, causing the payment to fail.
Your options are:
A) Create a new LN channel on-the-spot where you're likely to incur current peak fee rates of 5-20sat/byte.
B) Create an on-chain payment this time and open a LN channel when fees are more reasonable.
C) Use an alternate currency for the transaction.
There is a fundamental divide over the status of option C. Some people view Bitcoin primarily as a store of value, and thus, as long as some onramps and offramps are available, the currency will hold value. Other people believe that fungibility is what gives a cryptocurrency its value and that option C would fundamentally undermine the value of the currency.
I don't mean to dismiss either argument, but option C opens a can of worms that alone can fill economic textbooks. For the sake of simplicity, we will throw out option C as a possibility and save that debate for another day. We will simply require that payment is made in crypto.
With option B, you would absolutely need to pay the peak rate (likely higher) for a single transaction as a Point-of-Sale scenario with a full mempool would likely require at least one confirm and both parties would want that as soon as possible after payment. It would not be unlikely to pay 20-40 sat/byte on a single transaction and then pay 1sat/byte for an open and close to enable LN payments later. Even in the low end, the total cost is 20¢ for on-chain + open + close.
With present-day statistics, your LN channel would have to do 182 transactions to make up for the one peak on-chain transaction you were forced to do.
With option A, you still require one confirm. Let's also give the additional leeway that in this scenario you have time to sit and wait a couple of blocks for your confirm before you order / pay. You can thus pay peak rates alone and not peak + ensure next block rates. This will most likely be in the 5-20 sat/byte range. With 5sat/byte open and 1sat/byte close, your LN would have to do 50 transactions to break even
On close, fees are incurred by the funding party, so there could be scenarios where the receiving party is incentivized to close in order to spend outputs while the software automatically calculates fees based on current rates. If this is the case, the receiving party could cause the funding party to incur a higher-than-planned fee.
With that being said, any software that allows the funding party to set the fee beforehand would avoid unplanned fees, so we'll assume low fees for closing.
Is this problem solvable?
It depends.
In order to avoid the peak-fee open/close ratio problem, the Bitcoin network either needs to have much higher LN / Segwit utilization, or increase on-chain capacity. If it gets to a point where transactions stack up, users will be required to pay more than 1sat/byte per transaction and should expect as much.
Current Bitcoin network utilization is close enough to 100% to fill blocks during peak times. I also did an export of the data available at Blockchair.com for the last 3000 blocks which is approximately the last 3 weeks of data. According to their block-weight statistics, The average Bitcoin block is 65.95% full. This means that on-chain, Bitcoin can only increase in transaction volume by around 50% and all other scaling must happen via increased Segwit and LN use.
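For reference, the fullness figure above can be reproduced from a block export with a few lines; the file and column names below are hypothetical and will depend on the actual export format:

```python
# Average block fullness over the last ~3000 blocks, from a hypothetical Blockchair export.
import pandas as pd

MAX_BLOCK_WEIGHT = 4_000_000  # consensus weight limit per block

blocks = pd.read_csv("blockchair_bitcoin_blocks.csv")   # hypothetical file name
fullness = blocks["weight"] / MAX_BLOCK_WEIGHT          # hypothetical column name
print(f"average fullness: {fullness.mean():.2%}")       # the post reports ~65.95%
print(f"on-chain headroom: {1 / fullness.mean() - 1:.0%} more volume")  # ~50%
```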
Problem 3: You don't fully control your LN channel states.
Common arguments:
BCH: You can get into a scenario where you don't have output capacity and need to open a new channel.
BCH: A hostile actor can cause you to lose funds during a high-fee situation where a close is forced.
BTC: You can easily re-load your channel by pushing outbound to inbound.
BCH: You can't control whether nodes you connect to are online or offline.
There's a lot to digest here, but LN is essentially a 2-way contract between 2 parties. Not only does the drafting party pay the fees as of right now, but connected 3rd parties can affect the state of this contract. Some interesting scenarios develop because of it, and you aren't always in full control of your side of it.
Lack of outbound capacity
First, it's true that if you run out of outbound capacity, you either need to reload or create a new channel. This could potentially require 0, 1, or 2 additional on-chain transactions.
If a network loop exists between a low-outbound-capacity channel and yourself, you could push transactional capacity through the loop back to the output you wish to spend to. This would require 0 on-chain transactions and would only cost 1 (relatively negligible) LN fee charge. For all intents and purposes... this is actually kind of a cool scenario.
If no network loop exists from you-to-you, things get more complex. I've seen proposals like using Bitrefill to push capacity back to your node. In order to do this, you would have an account with them and they would lend custodial support based on your account. While people opting for trustless money will take issue with 3rd-party custodians, I don't think this alone is a horrible solution to the LN outbound-capacity problem... although it depends on the fee Bitrefill charges to maintain an account, and account charges could negate the effectiveness of using the LN. Still, we will assume this is a 0 on-chain scenario and would only cost 1 LN fee, which remains relatively negligible.
If no network loop exists from you and you don't have a refill service set up, you'll need at least one on-chain payment to another LN entity in exchange for them pushing LN capacity to you. Let's assume ideal fee rates. If this is the case, your refill would require an additional 7 transactions for that channel's new break-even. Multiply that by the number of sat/byte if you have to pay more.
Opening a new channel is the last possibility and we go back to the dynamics of 13 transactions per LN channel in the ideal scenario.
Hostile actors
There are some potential attack vectors previously proposed. Most of these are theoretical and/or require high fee scenarios to come about. I think that everyone should be wary of them, however I'm going to ignore most of them again for the sake of succinctness.
This is not to be dismissive... it's just because my post length has already bored most casual readers half to death and I don't want to be responsible for finishing the job.
Pushing outbound to inbound
While I've discussed scenarios for this push above, there are some strange scenarios that arise where pushing outbound to inbound is not possible and even some scenarios where a 3rd party drains your outbound capacity before you can spend it.
A while back I did a testnet simulation to prove that this scenario can and will happen. It was a reply posted 2 weeks after the initial post, so it flew heavily under the radar, but the proof is there.
The moral of this story is that in some scenarios, you can't count on loaded network capacity still being there by the time you want to spend it.
Online vs Offline Nodes
We can't even be sure that a given computer is online to sign a channel open or push capacity until we try. Offline nodes provide a brick-wall in the pathfinding algorithm so an alternate route must be found. If we have enough channel connectivity to be statistically sure we can route around this issue, we're in good shape. If not, we're going to have issues.
Is this problem solvable?
Only if the Lightning network can provide an (effectively) infinite amount of capacity... but...
Problem 4: Lightning Network is not infinite.
Common arguments:
BTC: Lightning network can scale infinitely so there's no problem.
Unfortunately, LN is not infinitely scalable. In fact, finding a pathway from one node to another is roughly comparable to the traveling salesman problem, or at best to Dijkstra's algorithm, a problem whose cost diverges polynomially. The most efficient proposals have a difficulty bounded by O(n^2).
Note - in the above I confused the complexity of the traveling salesman problem with Dijkstra's, when they do not have the same bound. With that being said, the complexity of the LN will still diverge with size.
In lay terms, what that means is every time you double the size of the Lightning Network, finding an indirect LN pathway becomes 4 times as difficult and data intensive. This means that for every doubling, the amount of traffic resulting from a single request also quadruples.
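To give a feel for what the routing work actually is, here is a toy Dijkstra-style cheapest-route search over a made-up channel graph; real LN routing also has to account for channel capacities, fee policies, and CLTV deltas, which this sketch ignores entirely:

```python
# Toy cheapest-route search over a payment-channel graph (fees as edge weights).
import heapq

def cheapest_route(graph, source, target):
    """graph: {node: [(neighbor, fee), ...]}. Returns (total_fee, path) or None."""
    best = {source: 0}
    heap = [(0, source, [source])]
    while heap:
        fee, node, path = heapq.heappop(heap)
        if node == target:
            return fee, path
        if fee > best.get(node, float("inf")):
            continue  # stale heap entry
        for neighbor, hop_fee in graph.get(node, []):
            candidate = fee + hop_fee
            if candidate < best.get(neighbor, float("inf")):
                best[neighbor] = candidate
                heapq.heappush(heap, (candidate, neighbor, path + [neighbor]))
    return None

channels = {
    "alice": [("bob", 1), ("carol", 4)],
    "bob":   [("carol", 1), ("dave", 6)],
    "carol": [("dave", 2)],
    "dave":  [],
}
print(cheapest_route(channels, "alice", "dave"))  # (4, ['alice', 'bob', 'carol', 'dave'])
```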
You can potentially temporarily mitigate traffic by bounding the number of hops taken, but that would encourage a greater channel-per-user ratio.
For a famous example... the game "6 degrees of Kevin Bacon" postulates that any actor can be connected to Kevin Bacon through co-stars within 6 degrees of separation. If the game were reduced to "4 degrees of Kevin Bacon," users of this network would still want as many connections as possible, so they'd be incentivized to hire Kevin Bacon to star in everything. You'd start to see ridiculous mash-ups and reboots just to get more connectivity... Just imagine hearing: Coming soon - Kevin Bacon and Adam Sandler star in "Billy Madison 2: Replace the Face."
Is this problem solvable?
Signs point to no.
So technically, if the average computational power and network connectivity can handle a problem of size (the number of Lightning Network channels needed to connect the world)² in a trivial amount of time, the Lightning Network is effectively infinite, as the upper bound of a non-infinite Earth would limit time-frames to those that are computationally feasible.
With that being said, BTC has discussed Lightning dev comments before that estimated a cap of 10,000 - 1,000,000 channels before problems are encountered, which is far less than the required "number of channels needed to connect the world" level.
In fact, SHA256 is a newer hard problem than the traveling salesman problem. That means that statistically, and based on the amount of review that has been given to each problem, it is more likely that SHA256 - the algorithm that lends security to all of Bitcoin - is cracked before the traveling salesman problem is. Notions that "a dedicated dev team can suddenly solve this problem", while not technically impossible, border on the statistically absurd.
Edit - While the case isn't quite as bad as the traveling salesman problem, the problem will still diverge with size and finding a more efficient algorithm is nearly as unlikely.
This upper bound shows that we cannot count on infinite scalability or connectivity for the Lightning Network. Thus, there will always be on-chain fee pressure, and it will rise as the LN approaches its computational upper bound.
Because you can't count on channel states, the on-chain fee pressure will cause typical sat/byte fees to raise. The higher this rate, the more transactions you have to make for a Lightning payment open/close operation to pay for itself.
This is, of course, unless it is substantially reworked or substituted with an O(log(n))-or-better solution.
Finally, I'd like to add that creating an on-chain transaction is a fixed, non-recursive, non-looping operation - effectively O(1); sending this transaction over a peer-to-peer network is bounded by O(log(n)); and accepting payment is, again, O(1). This means that (as far as I can tell) on-chain transactions (very likely) scale more effectively than the Lightning Network in its current state.
Additional notes:
My computational difficulty assumptions were based on a generalized, but similar problem set for both LN and on-chain instances. I may have overlooked additional steps needed for the specific implementation, and I may have overlooked reasons a problem is a simplified version requiring reduced computational difficulty.
I would appreciate review and comment on my assumptions for computational difficulty and will happily correct said assumptions if reasonable evidence is given that a problem doesn't adhere to listed computational difficulty.
TL;DR: While Lightning node payments themselves cost less than on-chain BCH payments, the associated overhead currently requires a LN channel to produce 16 transactions just to break-even under ideal 1sat/byte circumstances and substantially more as the fee rate goes up.
Further, the Lightning network can provide no guarantee in its current state to maintain/reduce fees to 1sat/byte.
submitted by CaptainPatent to btc

AN INTRODUCTION TO DIGIBYTE

DigiByte

What are cryptocurrencies?
Cryptocurrencies are peer-to-peer technology protocols which rely on the block-chain: a system of decentralized record-keeping which allows people to exchange unmodifiable and indestructible pieces of information ("coins") globally, in little to no time, with little to no fees. This translates into an exchange of value, as these coins cannot be counterfeited or stolen. The concept was introduced by Satoshi Nakamoto (allegedly a pseudonym for a single person or an organization), who described Bitcoin in 2008 and launched it in 2009.
What is DigiByte?
DigiByte (DGB) is a cryptocurrency like Bitcoin. It is also a decentralized applications protocol in a similar fashion to Neo or Ethereum.
DigiByte was founded and created by Jared Tate in 2014. DigiByte allows for fast (virtually instant) and low-cost (virtually free) transactions. DigiByte is hard-capped at 21 billion coins, which will be mined over a period of 21 years. DigiByte was never an ICO and was mined/created in the same way that Bitcoin or Litecoin initially were.
DigiByte is the fastest UTXO PoW scalable block-chain in the world. We’ll cover what this really means down below.
DigiByte has put forth and applied solutions to many of the problems that have plagued Bitcoin and cryptocurrencies in general – those being:
We will address these point by point in the subsequent sections.
The DigiByte Protocol
DigiByte maintains these properties through use of various technological innovations which we will briefly address below.
Why so many coins? 21 Billion
When initially conceived, Bitcoin was the first of its kind, and it came into the hands of a few. The beginnings of a coin such as Bitcoin were difficult; it had to go through a lot of initial growing pains which later coins did not have to face. It is for this reason, among others, that I believe Bitcoin was capped at 21 million, and why today it has secured a place as digital gold.
When Bitcoin was first invented no one knew anything about cryptocurrencies; for the inventor to get coins out to the public, he would have to give them away. This is probably how the first Bitcoins were passed on: for free! But then, as interest grew, so did the community. For the community to build something which could go on to have actual value, the coin would have to go through a steady growth phase. Therefore, controlling inflation through mining was extremely important. That is also why the cap for Bitcoin was probably set so low: to allow these coins to amass value without being destroyed by inflation (from mining) in the same way fiat is today. In my mind, Satoshi Nakamoto knew what he was doing when setting it at 21 million BTC, and must have known, and even anticipated, that others would take his design and build on top of it.
At DigiByte, we are that better design, capped at 21 billion coins: 1,000 times the supply of Bitcoin. Why, though? Why is the cap on DigiByte so much higher than that of Bitcoin? Because DigiByte was conceived to be used not as digital gold, nor as any sort of commodity, but as a real currency!
Today, planet Earth holds approximately 7.6 billion people. If every person wanted or needed to use and live off Bitcoin, then, split equally, each person could own at most 0.00276315789 BTC. All the money on the whole planet is estimated to have recently passed 80 trillion dollars in value. That means that each whole unit of Bitcoin would be worth approximately $3,809,523.81!
$3,809,523.81
This is of course an extreme case where everyone uses Bitcoin for everything. But even in a more conservative scenario, the fact remains that with such a low supply, each unit of Bitcoin would become absurdly expensive, if not inaccessible to most. Imagine trying to buy anything under a dollar!
Not only would using Bitcoin as an everyday currency be a logistical nightmare, it would be nigh impossible, for each satoshi of a Bitcoin would be worth much, much more than what is realistically manageable.
This is where DigiByte comes in and where it shines. DigiByte aims to be used world-wide as an international currency! Not to be hoarded in the same way Bitcoin is. If we were to do some of the same calculations with DigiByte we'd find that the numbers are a lot more reasonable.
At 7.6 billion people, each person could own 2.76315789474 DGB. Each whole unit of DGB would be worth approximately $3,809.52.
$3,809.52
This is much more manageable, and remember, that is the extreme case where everyone uses DigiByte for everything! I don't expect this to happen anytime soon, but DigiByte's supply would allow us to live and transact in a much more realistic and fluid fashion, without having to divide large numbers on our phone's calculator to understand how much we owe for that cup of coffee. With DigiByte it's simple: coffee costs 1.5 DGB, the cinema 2.8 DGB, a plane ticket 500 DGB!
There is a reason for DigiByte's large supply, and it is a good one!
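For anyone who wants to check the per-person arithmetic above, here is a quick sketch (the population and world-money figures are the rough estimates used in this article):

```python
# Per-capita supply and implied price if a coin carried all the world's money.
WORLD_POPULATION = 7.6e9
WORLD_MONEY_USD = 80e12  # rough estimate used above

for name, supply in [("BTC", 21e6), ("DGB", 21e9)]:
    per_person = supply / WORLD_POPULATION
    price_if_all_money = WORLD_MONEY_USD / supply
    print(f"{name}: {per_person:.11f} coins per person, "
          f"${price_if_all_money:,.2f} per coin")
```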
Decentralisation
Decentralisation is an important concept for the block-chain and cryptocurrencies in general. This allows for a system which cannot be controlled nor manipulated no matter how large the organization in play or their intentions. DigiByte’s chain remains out of the reach of even the most powerful government. This allows for people to transact freely and openly without fear of censorship.
Decentralisation on the DigiByte block-chain is assured by having an accessible and fair mining protocol in place – this is the multi-algorithm (MultiAlgo) approach. We believe that all should have access to DigiByte whether through purchase or by mining. Therefore, DigiByte is minable not only on dedicated mining hardware such as Antminers, but also through use of conventional graphics cards. The multi-algorithm approach allows for users to mine on a variety of hardware types through use of one of the 5 mining algorithms supported by DigiByte. Those being:
Please note that these mining algorithms are modified and updated from time to time to assure complete decentralisation and thus ultimate security.
The problem with using only one mining algorithm, as Bitcoin or Litecoin do, is that it allows people to continually amass mining hardware and hash power. The more hash power one has, the more one can accumulate. This leads to a cycle of centralisation and the creation of mining centres. It is known that a massive portion of all hash power in Bitcoin comes from China. This kind of centralisation is a natural tendency, as it is cheaper for large organisations to set up in countries with inexpensive electricity and other such advantages which may be unavailable to the average miner.
DigiByte mitigates this problem with the use of multiple algorithms. It allows for miners with many different kinds of hardware to mine the same coin on an even playing field. Mining difficulty is set relative to the mining algorithm used. This allows for those with dedicated mining rigs to mine alongside those with more modest machines – and all secure the DigiByte chain while maintaining decentralisation.
Low Fees
Low fees are maintained in DigiByte thanks to the MultiAlgo approach working in conjunction with MultiShield (originally known as DigiShield). MultiShield calls for block difficulty readjustment between every single block on the chain; currently blocks last 15 seconds. This continuous difficulty readjustment allows us to combat any bad actors which may wish to manipulate the DigiByte chain.
Manipulation may be done by a large pool or a single entity with a great amount of hash power mining blocks on the chain, thus increasing the difficulty of the chain. In some coins, such as Bitcoin and Litecoin, difficulty is readjusted every 2016 blocks, at approximately 10-minute and 2.5-minute block intervals respectively; this means that Bitcoin's difficulty is readjusted about every two weeks. This system can allow large bad actors to mine a coin and then abandon it, leaving it with a difficulty level far too high for the present hash rate – and so transactions can be frozen, and the chain stopped, until there is a difficulty readjustment and/or enough hash power to mine the chain. In such a case users may be faced with a choice - pay exorbitant fees or have their transactions frozen. In an extreme case the whole chain could be frozen completely for extended periods of time.
DigiByte does not face this problem as its difficulty is readjusted per block every 15 seconds. This innovation was a technological breakthrough and was adopted by several other coins in the cryptocurrency environment such as Dogecoin, Z-Cash, Ubiq, Monacoin, and Bitcoin Gold.
This difficulty readjustment, along with the MultiAlgo approach, allows DigiByte to maintain the lowest fees of any UTXO PoW chain in the world. Currently fees on the DigiByte block-chain are about 0.0001 DGB for a transaction sending 100 000 DGB. The fee depends on the amount sent; at the time of writing 100 000 DGB are worth around $2000.00, which puts that fee at roughly $0.000002, or about two ten-thousandths of a cent. It would take around 5,000 such transactions to add up to a penny's worth of fees. This was tested on a Ledger Nano S set to the low-fee setting.
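A quick sanity check of that fee arithmetic, with the DGB price inferred from the figure above (100,000 DGB ≈ $2,000):

```python
# Fiat value of a 0.0001 DGB fee at roughly $0.02 per DGB (inferred from the post).
DGB_USD = 2_000 / 100_000   # ≈ $0.02 per DGB
FEE_DGB = 0.0001            # observed fee for a 100,000 DGB send

fee_usd = FEE_DGB * DGB_USD
print(f"fee ≈ ${fee_usd:.6f} ≈ {fee_usd * 100:.4f} cents")       # ~0.0002 cents
print(f"transactions per penny of fees: {0.01 / fee_usd:,.0f}")  # ~5,000
```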
Fast transaction times
Fast transactions are ensured by the conjunctive use of the two aforementioned protocols. The use of MultiShield and MultiAlgo keeps mining the DigiByte chain consistently profitable, and thus there is always someone mining your transactions. MultiAlgo allows a greater amount of hash power to be spread world-wide; this, along with 15-second block times, allows transactions to be near-instantaneous. This speed is also ensured by the use of DigiSpeed, the protocol by which the DigiByte chain will decrease block timing gradually. Initially DigiByte started with 30-second block times in 2014; today blocks are set at 15 seconds. This decrease will allow for ever faster and ever more transactions per block.
Robust security + The Immutable Ledger
At the core of cryptocurrency security is decentralisation. As stated before decentralisation is ensured on the DigiByte block chain by use of the MultiAlgo approach. Each algorithm in the MultiAlgo approach of DigiByte is only allowed about 20% of all new blocks. This in conjunction with MultiShield allows for DigiByte to be the most secure, most reliable, and fastest UTXO block chain on the planet. This means that DigiByte is a proof of work (PoW) block-chain where all transactional activities are stored on the immutable public ledger world-wide. In DigiByte there is no need for the Lightning protocol (although we have it) nor sidechains to scale, and thus we get to keep PoW’s security.
There are many great debates as to the robustness or cleanliness of PoW. The fact remains that PoW block-chains remain the only systems in human history which have never been hacked and thus their security is maximal.
For an attacker to divert the DigiByte chain they would need to control over 93% of all the hashrate on one algorithm and 51% of the other four. And so DigiByte is immune to the infamous 51% attack to which Bitcoin and Litecoin are vulnerable.
Moreover, the DigiByte block-chain is currently spread over 200 000 plus servers, computers, phones, and other machines world-wide. The fact is that DigiByte is one of the easiest to mine coins there is – this is greatly aided by the recent release of the one click miner. This allows for ever greater decentralisation which in turn assures that there is no single point of failure and the chain is thus virtually un-attackable.
On Chain Scalability
The biggest barrier for block-chains today is scalability. Visa the credit card company can handle around 2000 transactions per second (TPS) today. This allows them to ensure customer security and transactional rates nation-wide. Bitcoin currently sits at around 7 TPS and Litecoin at 28 TPS (56 TPS with SegWit). All the technological innovations I’ve mentioned above come together to allow for DigiByte to be the fastest PoW block-chain in the world and the most scalable.
DigiByte is scalable because of DigiSpeed, the protocol through which block times are decreased and block sizes are increased. It is known that a simple increase in block size can increase the TPS of any block-chain, as is the case with Bitcoin Cash. This is, however, not scalable. The reason a simple increase in block size is not scalable is that it would eventually lead to some, if not a great amount of, centralization. This centralization occurs because larger block sizes mean that storage costs, and thus hardware costs for miners, increase. This increase, along with full blocks - meaning many transactions occurring on the chain - will inevitably bar the average miner once difficulty increases and mining centres consolidate.
Hardware and storage costs decrease over time following Moore's law, and DigiByte adheres to it. DigiSpeed calls for block sizes to double and block times to halve every two years. DigiByte started with 1 MB blocks every 30 seconds at inception in 2014; in 2016 it doubled the block size and halved the block timing, following that schedule. Moore's law holds that, in general, hardware roughly doubles in power (and halves in cost per unit of performance) approximately every two years.
This allows DigiByte to scale at a steady rate and lets people adopt new hardware at an equally steady rate and a reasonable expense. Thus, the average miner can continue to mine DigiByte on his algorithm of choice with entry-level hardware.
DigiByte was one of the first block-chains to adopt Segregated Witness (SegWit, in 2017), a protocol whereby part of the transaction data is removed and stored elsewhere to decrease transaction weight and thus increase scalability and speed. This allows more transactions to fit in each block without increasing its size!
DigiByte currently sits at 560 TPS and could scale to over 280 000 TPS by 2035. This dwarfs the current and even projected capacities of most coins and even of private companies. In essence DigiByte could scale worldwide today and still be reliable and robust. DigiByte could even handle the cumulative transactions of all the top 50 coins on coinmarketcap.com and still run smoothly and below capacity. In fact, to max out DigiByte's actual maximum capacity (today at 560 TPS) you would have to take all those transactions and multiply them by a factor of 10!
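The 2035 figure follows directly from the doubling-every-two-years schedule described above; here is a quick sketch of that projection, treating 2017 and 560 TPS as the starting point as this article does:

```python
# DigiSpeed-style projection: throughput doubles every two years.
def projected_tps(start_tps=560, start_year=2017, target_year=2035, period_years=2):
    doublings = (target_year - start_year) // period_years
    return start_tps * 2 ** doublings

print(projected_tps())  # 560 * 2**9 = 286,720 -> "over 280,000 TPS by 2035"
```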
Other Uses for DigiByte
Note that DigiByte is not only to be used as a currency. Its robustness, security and scalability make it ideal for building decentralised applications (DAPPS), which it can host. DigiByte can in fact host DAPPS, and even centralised versions which rely on the chain, known as Digi-Apps. This application layer is also accompanied by a smart-contract layer.
Thus, DigiByte could host several Crypto Kitties games and more without freezing out or increasing transaction costs for the end user.
Currently there are various DAPPS being built on the DigiByte block-chain, these are done independently of the DigiByte core team. These companies are simply using the DigiByte block-chain as a utility much in the same way one uses a road to get to work. One such example is Loly – a Tinderesque consensual dating application.
DigiByte also hosts a variety of other platform projects such as the following:
The DigiByte Foundation
As previously mentioned, DigiByte was not an ICO. The DigiByte Foundation was established in 2017 by founder Jared Tate. It is a non-profit organization dedicated to supporting and developing the DigiByte block-chain.
DigiByte is a community effort and a community coin, to be treated as a public resource like water or air. Know that anyone can work on DigiByte; anyone can create and do as they wish. It is a permissionless system which encourages innovation and creation. If you have an idea, or would like help with your project, do not hesitate to contact the DigiByte Foundation, either through the official website or the Telegram developers' channel.
For this reason, it is all the more important to note that the DigiByte Foundation cannot exist without public support, which is why I encourage everyone to donate to the foundation. All funds are used for the maintenance of DigiByte servers, marketing, and DigiByte development.
DigiByte Resources and Websites
DigiByte
Wallets
Explorers
Please refer to the sidebar of this sub-reddit for more resources and information.
Edit - Removed Jaxx wallet.
Edit - A new section was added to the article: Why so many coins? 21 Billion
Edit - Adjusted max capacity of DGB's TPS - Note it's actually larger than I initially calculated.
Edit – Grammar and format readjustment
Hello,
I hope you’ve enjoyed my article, I originally wrote this for the reddit sub-wiki where it generally will most likely, probably not, get a lot of attention. So instead I've decided to make this sort of an introductory post, an open letter, to any newcomers to DGB or for those whom are just curious.
I tried to cover every aspect of DGB, but of course I may have forgotten something! Please leave a comment down below and tell me why you're in DGB. What convinced you? For me it's the decentralised PoW that really did it. Plus that transaction speed and virtually no fees! Made my mouth water!
-Dereck de Mézquita
I'm a student typing this stuff on my free time, help me pay my debts? Thank you!
D64fAFQvJMhrBUNYpqUKQjqKrMLu76j24g
https://digiexplorer.info/address/D64fAFQvJMhrBUNYpqUKQjqKrMLu76j24g
submitted by xeno_biologist to Digibyte

RaiBlocks AMA Summary!

I posted this under /cryptocurrency and /cryptomarkets as well! Might be less useful under this subreddit... but I'm using it for purposes of helping people become aware of this coin.
Summation of RaiBlocks lead developer AMA. I'm very excited about this coin, and if you're asking why I did this...I'm trying out my AMA consolidating script that I wrote for fun :) I'm interested in seeing what people think about this coin! You can read the responses directly from this link: https://www.reddit.com/RaiBlocks/comments/7ko5l7/colin_lemahieu_founder_and_lead_developer_of/
 
What are your top priorities atm? Both in developing areas itself and in terms of integration?
 
The top priorities right now are:
These basically need to happen in a sequence because each item isn't useful unless the previous one is complete.
 
 
Do you have any plans to have your source code peer reviewed? By peer review I mean sending your source code down to MIT for testing and review.
Where do you see Raiblocks 5-10 years from now? (For instance do you envision people using a Raiblocks mobile phone app to transfer value between each other, or buy stuff at the store?
 
We definitely need peer and code reviews and we're open to anyone doing this. We have ideas for people in universities that want to analyze the whitepaper or code so we'll see what comes of that. In my opinion code security guarantees can only be given with (eyes * time) and we need both.
I'd like to see RaiBlocks adopted as an internet RFC and basically become a ubiquitous background technology like http. I think you're probably right and a mobile app would be the most user-friendly way to do this so people don't need to carry around extra cards in their wallet etc.
 
 
Is there a list of the team readily available? Are there firm plans to expand, and if so, in which directions?
The roadmap indicated a website redesign scheduled for November 2017. Is there an update?
 
We have about 12 people in the core team; about half are code and half are business developers. On the redesigned website we're going to include bios for sure, no one in our team is anonymous. I think we have pretty good coverage of what we need right now, we could always use more people capable of contributing to the core code.
The website design is well underway; we wanted to streamline it and add some more things, so it took longer than originally estimated. It's looking like we'll have it ready after the new year.
 
 
Would you ever consider renaming the coin to simply "Rai" or any other simplified form other than RaiBlocks?
2. What marketing strategy do you think will push XRB forward from now on as a fully working product. Instant and free, the green coin, "it just works" coin, etc?
3. Regarding security, is "quantum-proofing" a big concern at the moment and how do you guys plan to approach this when the time comes. And how possible would it be for bad actors to successfully implement a 51% attack.
 
  1. Yea there are a few difficulties people have pointed out with our name. People don't know if it's "ray" or "rye". "Blocks" doesn't have a meaning to a lot of people and the name reference might be too esoteric to be meaningful. I'm not prideful so I'm not stuck on a particular name, we'll take a look at what our marketing and business developers say peoples' impressions are and if they have any naming recommendations.
  2. Our marketing strategy is to focus on complete simplicity. Instant and free resonates with enthusiasts and mass adoption will only come when using xrb is absolutely the same experience as using a banking or other payment app. People aren't going to tolerate jargon or confusing workflows when sending or receiving payments.
  3. Quantum computing is going to be an amazing leap for humanity but it's also going to cause a lot of flux in cryptography. The plan I see is similar to what I did in selecting the cryptographic algorithms we're using right now: look for leaders in academia and industry that have proven implementations and adopt those as they recommend migration based on computing capability. Quantum vulnerabilities can be an issue in the future, but a vulnerable implementation would be an issue right now.
 
 
Hi Colin, lately XRB has been getting frequently compared to and contrasted with Iota. I was hoping that you could give us your thoughts on the differences between the two and what your general vision for the future of Raiblocks is.
 
It's flattering to be compared to IOTA, they have a very talented team building ambitious technology. When looking at design goals I think one thing we're not attempting to approach is transferring a data payload, we're only looking to be a transfer of value.
There are lots of ideas and technology to be developed in the cryptocurrency space and I want RaiBlocks to solve one section of that industry: the transfer of value. I think the best success would be if RaiBlocks was adopted as the global standard for this and crypto efforts could move to non-value-transfer use-cases.
 
 
Do you see XRB becoming the new payment method for commerce. As in, buying coffee, groceries, etc? Do you have plans for combating the HODL mentality so this currency can actually be used in the future of buying and selling?
 
Being a direct transactional payment method is our goal and we're trying to build software that's accessible to everyone to make that happen. I see holding as a speculative tactic anticipating future increases and you're right, it's not in line with day-to-day transactions. I think as market cap levels off to a more consistent value the reason for holding and speculating goes away and people can instead focus on using it as a value exchange.
 
 
Are you planning to expand the RaiBlocks team over the next 12 months? If so, what types of positions are you hoping to fill?
 
Right now we have about 12 people, half core and half business developers. I think this count is good for working on what we're doing right now which is getting wallets and exchanges worked on. Ideally people outside our team will start developing technology around xrb taking advantage of the network effect to build more technology faster than we could internally. That being said we're going to look in a few months to see if there's anything out there people aren't developing that should be and we'll see what people we need to make it happen.
 
 
At what point did you make the decision to make RaiBlocks your full time job? What was the decision making process like?
 
It was after the week where the core team met here in Austin to brainstorm our next steps. I saw how much enthusiasm there was from crypto-veterans about having a working system capable of being scaled up to what's needed for massive adoption, and it seemed the risk needed to be taken.
It was a hard decision to make; working in crypto and finance is rough and I like using my leisure time to work on inventions. Of all the project ideas I have, this one seemed to have a high chance of success, and the benefits of having a working, decentralized currency would be huge.
 
 
Hi Colin, what prevents great cryptos like XRB from being listed on bigger exchanges?
 
It's good to understand where the biggest headaches for exchanges lie: support tickets, operations, and development. If a technology is different from what they already have, that takes development time. If the software is new and not widely run, that's potential operations time to fix it which results in support tickets and community backlash. Adding Bitcoin clones or Ethereum ICO coins is easy because they don't have these associated risks or costs.
 
 
What can the average RaiBlocks fan do to help XRB get adopted / grow / expand?
 
I think the best thing an average fan could do is word of mouth and telling people about RaiBlocks. More people being aware of it means there's the possibility someone who's never heard of it before would be interested in contributing as a vendor, developer, exchange etc.
Good advertising or marketing will never be able to reach everyone as well as someone reaching out within their own network.
 
 
Ray or Rye?
 
Ray hehe. It comes from https://en.wikipedia.org/wiki/Rai_stones Lots of people don't know the answer though >_<
 
 
Are you looking at incorporating a data market like IOTA in the future? Given the speed of the network, a data exchange for highly accurate sensors could be a game changer.
Furthermore, are there any plans to increase the dev team in the future? I read in the FAQ that you'd like RaiBlocks to be somewhat of a protocol, which is a huge ambition. A dev from, say, the Mozilla Foundation or elsewhere could further cement this ambitious project.
 
Transmitting data payloads is something we probably won't pursue. The concern is adding more features like this could cause us to make decisions that compromise the primary focus points of low-cost and speed for transferring value.
We can add people to the dev team, though I think we'll get the most traction by teaching teams in these other organizations how to use RaiBlocks so they can be the experts on the subject in their companies.
 
 
Does the current RaiBlocks version require the "Each node in the network must be aware of all transactions as they occur" part? This was in the old white paper and is asked here:
https://www.reddit.com/RaiBlocks/comments/7ksl81/some_questions_regarding_raiblocks_consensus/?st=jbdmgagc&sh=d1c93cca
 
If a node wants to independently know the balances of all accounts in the system, it must at a minimum have storage to hold the accounts and all their balances. In order to know all balances it must either listen to transactions as they're happening or bootstrap from someone else to catch up, which is what happens on startup.
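As a rough illustration of that bookkeeping (a minimal sketch only; the `Tx` type and field names are hypothetical and not the actual RaiBlocks block format), a node that wants independent knowledge of all balances simply applies every transaction it sees, whether it hears them live or replays them from a peer during bootstrap:

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class Tx:
    sender: str    # account the funds leave
    receiver: str  # account the funds arrive at
    amount: int    # value moved, in raw units

class Ledger:
    """Tracks every account balance by applying observed transactions."""
    def __init__(self):
        self.balances = defaultdict(int)

    def apply(self, tx: Tx) -> None:
        # Reject anything that would overdraw the sending account.
        if self.balances[tx.sender] < tx.amount:
            raise ValueError("insufficient balance")
        self.balances[tx.sender] -= tx.amount
        self.balances[tx.receiver] += tx.amount

# Bootstrapping means replaying history fetched from a peer; live operation means
# applying transactions as they arrive. Either way the ledger converges on the same state.
ledger = Ledger()
ledger.balances["genesis"] = 1_000_000
for tx in (Tx("genesis", "alice", 500), Tx("alice", "bob", 200)):
    ledger.apply(tx)
print(ledger.balances["alice"], ledger.balances["bob"])  # 300 200
```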
 
 
There is no incentive to run nodes. Some people will do it because it is cheap as fuck (as I read, a Raspberry Pi can run it). But I think not many people will do it.
1. How important are the nodes in terms of further scaling?
2. Under which network conditions were the 7,000 transactions per second measured?
3. What happens if the transactions per day increase tenfold but the nodes don't?
4. How much better will Rai scale if someone sets up, let's say, 100 nodes with awesome hardware and network?
5. How many nodes could be enough for visa level scaling?
6. Which further improvements can be made for Rai if improvements other than setting up new nodes are needed? Are there other concepts like 2nd-layer solutions planned?
7. How will Rai defend network attacks?
I know there is a PoW part. But since there are also large attacks on high-cap coins, in which people invest millions of dollars to congest a network: is it possible that the Rai network will be unusable for several days because of this?
 
I think the out-of-protocol incentives for running a node are under-referenced, yet I see them as the primary driving factor for participating as a whole. Node rewards come at the expense of other network participants, and in this closed loop the incentives aren't enough to keep a cryptocurrency alive. Long-term there needs to be a system-level comparative advantage over what people are already using for a transfer of value. If someone is using xrb and it saves them hundreds or thousands of dollars per month in fees and customer irritation from delayed payments, they have a direct monetary incentive to use xrb and a monetary incentive in the health of the system.
1) More nodes provide transaction and bootstrapping redundancy. More representatives provide decentralization.
2) The 7k TPS was a profile of how fast commodity hardware could process transactions. All of the real-world limits are going to be hardware related: either bandwidth, IO, or CPU.
3) The scaling is more related to the hardware the nodes are using than to the node count. If there were a 10x increase in transactions it would use 10x the bandwidth and IO as nodes observe transactions happening.
4) If someone made 100 representative nodes the network would be far more decentralized though the tx throughput would be unchanged since that's a per-node requirement.
5) Scaling to Visa will have high bandwidth and IO requirements on representatives, associated with doing 10k IOPS (a rough back-of-the-envelope sketch follows this list). Datacenter and business-class hardware will have to be enough to handle the load.
6) Second layer solutions are always an option and I think a lot of people will use them for fraud protection and insurance. Our primary focus is to make the 1st layer as efficient and high speed as possible so a 2nd layer isn't needed for daily transactions.
7) Defending against network attacks will be an ongoing thing, people like breaking the network for lulz or monetary gain i.e. competing cryptos. If there are attacks we haven't defended against or considered it'll be a matter of getting capable people to fix issues.
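To make the hardware claim in point 5 concrete, here is the rough back-of-the-envelope sketch referenced above. The ~200 bytes per transaction is an assumption chosen for illustration, not a measured RaiBlocks figure:

```python
# Back-of-the-envelope load estimate for a representative at Visa-like volume.
# Assumptions (not measured values): ~10,000 transactions per second and
# ~200 bytes on the wire per transaction including vote traffic and overhead.
TPS = 10_000
BYTES_PER_TX = 200  # assumed average size, for illustration only

bandwidth_bytes_per_s = TPS * BYTES_PER_TX
disk_iops = TPS  # at least one ledger write per observed transaction

print(f"bandwidth ~ {bandwidth_bytes_per_s / 1e6:.1f} MB/s "
      f"({bandwidth_bytes_per_s * 8 / 1e6:.0f} Mbit/s)")
print(f"disk writes ~ {disk_iops:,} IOPS")
# ~2.0 MB/s (16 Mbit/s) of raw transaction data and ~10,000 IOPS: trivial on the
# bandwidth side, but the IOPS figure is what pushes representatives toward
# datacenter or business-class storage.
```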
 
 
Are you open to changes to the name? (Rai)
What are your plans with regards to marketing?
 
I'm open to it, people get confused on ray/rye pronunciation, not the greatest first impression.
As far as timing, I think marketing works best after a more user-friendly wallet and integration into more exchanges; otherwise we're sending traffic to something people can't use. We're going to start by focusing on the initial adopters, which will likely be enthusiasts, and going forward work on the next set of users that aren't enthusiasts but want to drive savings for their business through lower payment-processing costs.
 
 
A recent tweet (https://twitter.com/VitalikButerin/status/942961006614945792) from Vitalik Buterin. Could this be the case when testing the scalability of RaiBlocks as well, meaning in reality we wouldn't come close to 7,000 tx/s?
 
I think he's definitely right, a lot of the TPS numbers are synthetic benchmarks, usually on one system. The biggest things hindering TPS are protocol-specific limits like hard caps or high-contention designs. The next biggest thing will be bandwidth and then disk IO. Some of these limits can be improved by profiling and fixing code, rather than being actual limits in the hardware.
We want to get better, real-world numbers, but our general opinion is that the RaiBlocks protocol is going to be limited by hardware rather than design.
 
 
Are you planning to add a fiat gateway to the main website and mobile wallet?
 
If we can make it happen for sure, that seems like a very user-focused feature people would want.
The difficulty at least in the US is the money-transmitter licenses which are hard to obtain. More than likely if this functionality was added it'd be a partnership with an established financial company that has procedures in place to operate within countries' regulations.
 
 
I saw a post on /iota that claims that their quantum resistance is a main benefit over RaiBlocks. Can you go into detail about this and explain any plans you have to let XRB persevere through the upcoming quantum revolution?
 
I think everyone with cryptography in their programs is keeping an eye on quantum cryptography because we're all in the same boat. I don't have cryptanalysis credentials so I didn't feel comfortable building an implementation, and instead chose to use one off-the-shelf from someone with solid credentials.
There are some big companies that have made small mistakes that blow up the usefulness of the entire algorithm, it's incredibly easy to do. https://arstechnica.com/gaming/2010/12/ps3-hacked-through-poor-implementation-of-cryptography/
 
 
Hello Colin, is any security audit of the source code planned?
 
We don't have one contracted though both internally and externally this is an important thing people want completed.
 
 
Do you have plans to radically change the interface of the desktop wallet, and to develop a universal, cross-platform, clean and simple UX design for the wallet? This will be huge for mass adoption in my humble opinion
 
I completely agree. We do plan on completely redoing the desktop wallet, both from a UX standpoint and for maintainability, so UI code doesn't need to be in C++. This could also remove our dependency on Qt, which is the least permissive license in the code right now.
I write code better than I design GUIs ;)
 
 
It seems like RaiBlocks is aiming to be a true currency with its lack of transaction fees and fast confirmation times, which is great! If RaiBlocks can add some kind of support for privacy, then I think it has the whole picture figured out in terms of being "digital cash". Do you currently have any plans to implement privacy features into RaiBlocks?
If Raiblocks is unable to do this, it will still be a straight improvement over things like LTC which are currently being used as currency, but I don't think it will be able to become THE cryptocurrency without privacy features.
 
I love the concept of privacy in the network and it's a hard thing to do right. Any solution used would need to be compatible with our balance-weighted-voting method which means at least we'd have to know how much weight a representative has even if we're hiding actual account balances.
To be fully anonymous it would have to hide accounts, amounts, endpoints, and also timing information; with advanced network analysis the timing is the hardest thing to hide. Hopefully some day we can figure out an efficient privacy solution, though the immediate problem we can solve is making a transactional cryptocurrency, so we're focusing on that.
 
 
Could you provide an analysis on the flaws of RaiBlocks? Is it in any way, shape, or form at a disadvantage compared to a blockchain based ledger like bitcoin? There has to be drawbacks, but I haven’t found any.
Do you plan on expanding the dev team and establishing a foundation? Also, how much money is in the development pool?
 
One drawback is that our chain-per-account model and asynchronous updates take more code and design to handle. This means instead of one top-block hash for everything there's one for each account. This gives us the power of wait-free asynchronous transactions at the cost of simplicity.
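A minimal sketch of that bookkeeping difference (toy code with hypothetical payloads, not the real block format): instead of one global head hash, the ledger keeps a frontier hash per account, and two accounts can advance their frontiers independently:

```python
import hashlib

def block_hash(prev: str, payload: str) -> str:
    """Toy hash chaining; stands in for real block hashing and signing."""
    return hashlib.sha256((prev + payload).encode()).hexdigest()[:16]

# Single-chain ledger: one top-block hash covers every account.
top_block = block_hash("", "genesis")

# Chain-per-account ledger: one head (frontier) hash per account.
frontiers = {"alice": block_hash("", "alice-open"),
             "bob":   block_hash("", "bob-open")}

# Alice and Bob can each extend their own chain asynchronously; neither
# update has to wait on, or contend with, the other account's chain.
frontiers["alice"] = block_hash(frontiers["alice"], "send 5 to bob")
frontiers["bob"]   = block_hash(frontiers["bob"], "receive 5 from alice")
print(frontiers)
```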
After we finish up things like the wallet, website, and exchange integration we'll be looking at seeing what dev resources we need to build tech if no one else is already working on a particular thing. We have about 6 million XRB right now so we've made the existing dev funds go a long way. If something expensive to build came along and dev funds wouldn't cut it we could look at some sort of external funding.
 
 
How big of a problem is PoW for exchanges and what are potential solutions?
 
Considering how much exchanges stand to make through commission I don't see the cost as a barrier, it's just an abnormal technology request compared to other cryptocurrencies.
We're working on providing a service exchanges can use in the interim until they set up their own infrastructure to generate the work. Other options are containers people can use on cloud services to get the infrastructure they need until they want to invest in their own.
 
 
It's my understanding that since everything works asynchronously, in the case of double spending there is a chance a merchant would receive the block that would later be invalidated and have it shown in its wallet, even if a little later (1 minute?) the amount would be corrected when the delegates vote that block invalid. Is there any mechanism to avoid this? Maybe tag the transactions in the wallet as "confirming" and then "confirmed" after that minute? Is there actually any certain way for a wallet to know, in a deterministic/programmable way, at what moment a transaction is 100% legit? (For example, if the delegates are DoS'ed I guess that minute could be much longer.) I know this is an improbable case, but still...
 
Yea you're hitting a good point. The consensus algorithm in the node is designed to wait for the incoming transaction to settle before accepting it into the local chain for the exact reason you listed: if the sender's transaction were rolled back, the local account would be rolled back as well.
We can trend the current weight of all representatives that are online and voting and make sure we have >50% of the vote weight accounted for before considering it settled.
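A minimal sketch of that settlement rule, assuming the node already knows each online representative's voting weight (the data structures here are illustrative, not the node's actual internals):

```python
def is_settled(votes_for_block: set, online_weight: dict) -> bool:
    """A block is considered settled once the representatives that voted for it
    hold more than half of all voting weight currently observed online."""
    total_online = sum(online_weight.values())
    weight_for = sum(w for rep, w in online_weight.items() if rep in votes_for_block)
    return total_online > 0 and weight_for > total_online / 2

# Example: three online representatives holding weights 40, 35 and 25.
online_weight = {"rep_a": 40, "rep_b": 35, "rep_c": 25}
print(is_settled({"rep_a"}, online_weight))           # False: 40 is not > 50
print(is_settled({"rep_a", "rep_b"}, online_weight))  # True: 75 > 50
```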
 
 
Hey Colin, will you eventually have support for a Trezor or other hardware wallet?
 
Yea we'll definitely work with companies like Trezor that are interested in being a hardware wallet for xrb. It's just a matter of making sure they support the signing algorithms and integrating with their API.
 
EDIT: I'm getting a lot of messages asking me how to buy XRB. I used this guide which was very helpful: https://www.reddit.com/RaiBlocks/comments/7i0co0/the_definitive_guide_to_buying_and_storing/
In short -- buy BTC on Coinbase, open up an account on BitGrail, transfer that BTC from Coinbase to BitGrail, then trade your BTC for XRB. It's a pain right now because it's such a new coin, but soon it will be listed on more exchanges, and hopefully on things like ShapeShift/Changelly. After that it will be much easier... but until then, the inconvenience is what we have to pay in order to get into XRB while it's still early.
submitted by atriaxx to RaiBlocks

ColossusXT Q2 AMA Ends!

Thank you for being a part of the ColossusXT Reddit AMA! Below we summarize the questions and answers. The team responded to 78 questions! If your question was not included, it may have been answered in a previous question. The ColossusXT team will do a Reddit AMA at the end of every quarter.
The winner of the Q2 AMA Contest is: Shenbatu
Q: Why does your blockchain exist and what makes it unique?
A: ColossusXT exists to provide an energy-efficient method of supercomputing. ColossusXT is unique in many ways. Some coins have one layer of privacy; ColossusXT and the Colossus Grid will utilize two layers of privacy through obfuscation, the Zerocoin Protocol, and I2P, and these will protect users of the Colossus Grid as they utilize grid resources. There are also Masternodes and Proof of Stake, which can both contribute to reducing 51% attacks, along with instant transactions and zero-fee transactions. This protection is paramount as ColossusXT evolves into the Colossus Grid. Grid computing will have a pivotal role throughout the world, and what this means is that users will begin to experience the Internet as a seamless computational universe. Software applications, databases, sensors, video and audio streams will all be reborn as services that live in cyberspace, assembling and reassembling themselves on the fly to meet the tasks at hand. Once plugged into the grid, a desktop machine will draw computational horsepower from all the other computers on the grid.
Q: What is the Colossus Grid?
A: ColossusXT is an anonymous blockchain through obfuscation, Zerocoin Protocol, along with utilization of I2P. These features will protect end user privacy as ColossusXT evolves into the Colossus Grid. The Colossus Grid will connect devices in a peer-to-peer network enabling users and applications to rent the cycles and storage of other users’ machines. This marketplace of computing power and storage will exclusively run on COLX currency. These resources will be used to complete tasks requiring any amount of computation time and capacity, or allow end users to store data anonymously across the COLX decentralized network. Today, such resources are supplied by entities such as centralized cloud providers which are constrained by closed networks, proprietary payment systems, and hard-coded provisioning operations. Any user ranging from a single PC owner to a large data center can share resources through Colossus Grid and get paid in COLX for their contributions. Renters of computing power or storage space, on the other hand, may do so at low prices compared to the usual market prices because they are only using resources that already exist.
Q: When will zerocoin be fully integrated?
A: A beta has been released for community testing on Test-Net. As soon as all the developers consider the code ready for Main-Net, it will be released. Testing the code on a larger test network will ensure a smooth transition.
Q: Is the end goal for the Colossus Grid to act as a decentralized cloud service, a resource pool for COLX users, or something else?
A: Colossus Grid will act as a grid computing resource pool for any user running a COLX node. How and why we apply the grid to solve world problems will be an ever evolving story.
Q: What do you think the marketing role is in COLX? When will the in-wallet shared nodes be available? I know it's been stated in the roadmap, but as you don't follow the roadmap and offer everything in advance, I hope shared MNs will be available soon.
A: The ColossusXT (COLX) roadmap is a fluid design philosophy that evolves as the project and our community grow. Our goal is to deliver a working product to the market while at the same time adding useful features for the community to thrive on; perhaps the Colossus Grid and Shared Masternodes will both be available by the end of Q4 2018.
Q: When will your github be open to the public?
A: The GitHub has been open to the public for a few months now.
You can view the GitHub here: https://github.com/ColossusCoinXT
The latest commits here: https://github.com/ColossusCoinXT/ColossusCoinXT/commits/master
Q: Why should I use COLX instead of Monero?
A: ColossusXT offers Proof of Stake and Masternodes, both of which contribute layers of protection against the 51% attacks often associated with Proof of Work consensus, and ColossusXT is environmentally friendly compared to Proof of Work (Monero). You can generate passive income from Proof of Stake and Masternodes, along with helping secure the network. What really sets ColossusXT apart from Monero, and many other privacy projects being worked on right now, is the Colossus Grid. Once plugged into the Colossus Grid, a desktop machine will draw computational horsepower from all the other computers on the grid. Blockchain was built on the core value of decentralization, and ColossusXT adheres to these standards with end-user privacy in mind in the technology sector.
Q: With so many coins out with little to no purpose let alone a definitive use case, how will COLX distinguish itself from the crowd?
A: You are right, there are thousands of other coins. Many have no purpose, and we will see others "pumping" from day to day. It is the nature of markets and crypto as groups move from coin to coin to make a quick profit. As blockchain regulations and information are made more easily digestible, projects like ColossusXT will rise. Our goal is to produce a quality product that will be used globally to solve technical problems; in doing so, grid computing on the ColossusXT network could create markets of its own by utilizing supercomputing resources. ColossusXT is more than just a currency, and our steadfast approach to producing technical accomplishments will not go unnoticed.
Q: Tell the crowd something about the I2P integration plan in the roadmap? 🙂
A: ColossusXT will be moving the I2P network layer up in the roadmap to meet a quicker development pace for the Colossus Grid. The I2P layer will serve as an abstraction layer further obfuscating the users of ColossusXT (COLX) nodes. An abstraction layer allows two parties to communicate in an anonymous manner. This network is optimised for anonymous file-sharing.
Q: What kind of protocols, if any, are being considered to prevent or punish misuse of Colossus Grid resources by bad actors, such as participation in a botnet/denial of service attack or the storage of stolen information across the Grid?
A: What defines bad actors? ColossusXT plans on marketing to governments and cyber security companies globally, entities and individuals who will certainly want their privacy protected. There is a grey area between good and bad, and that is something we can certainly explore as a community. Did you have any ideas to contribute to this evolving variable? What we mean when we say marketing towards security companies and governments is being utilized for some of their projects and innovating new ways of grid computing.
Security: https://wiki.ncsa.illinois.edu/display/cybersec/Projects+and+Software
Governments: https://www.techwalla.com/articles/what-are-the-uses-of-a-supercomputer
Q: The Colossus Grid is well defined but I don't feel it is easily digestible. Has there been any talk of developing an easier-to-understand marketing plan to help broaden the investor/adopter base?
A: As we get closer to the release of the Colossus Grid, marketing for it will increase. It will have a user-friendly UI, and we will provide guides and FAQs with the release so that any user intending to share computing power will be able to comprehend it.
Q: Can you compare CollossusXT and Golem?
A: Yes. The Colossus Grid is similar to other grid computing projects. The difference is that ColossusXT is on its own blockchain and does not rely on the speed or congestion of a 3rd-party blockchain. The Colossus Grid has a privacy focus and will market to companies and individuals who would like to be more discreet when buying or selling resources, by offering multiple levels of privacy protections.
Q: How do you guys want to achieve to be one of the leaders as a privacy coin?
A: Being a privacy coin leader is not our end game. Privacy features are just a small portion of our framework. The Colossus Grid will include privacy features, but a decentralized Supercomputer is what will set us apart and we intend to be leading this industry in the coming years as our vision, and development continue to grow and scale with technology.
Q: With multiple coins within this space, data storage and privacy, how do you plan to differentiate COLX from the rest? Any further partnerships planned?
A: The Colossus Grid will differentiate ColossusXT from coins within the privacy space. The ColossusXT blockchain will differentiate us from the DATA storage space. Combining these two features with the ability to buy and sell computing power to complete different computational tasks through a decentralized marketplace. We intend to involve more businesses and individuals within the community and will invite many companies to join in connecting the grid to utilize shared resources and reduce energy waste globally when the BETA is available.
Q: Has colossus grid had the best come up out of all crypto coins?
A: Possibly. ColossusXT will continue to “come up” as we approach the launch of the Colossus Grid network.
Q: How far has ColossusXT gone with ATM integration?
A: ColossusXT intends to and will play an important role in the mass adoption of cryptocurrencies. We already have an ongoing partnership with PolisPay which will enable use of COLX via master debit cards. Along with this established relationship, the ColossusXT team is in touch with other possible companies that could use COLX widely, but these can only be disclosed upon mutual agreement.
Q: How does COLX intend to disrupt the computing industry through Grid Computing?
A: Using the Colossus Grid on the ColossusXT blockchain strengthens the network. Computers sit idly by for huge portions of the day. Connecting to the Colossus Grid and contributing those idle resources can make use of all the computing power going to waste and assist in advancing multiple technology sectors and solving issues: reducing costs and waste, and increasing speed, in sectors such as scientific research, machine learning, and cyber security, while making it possible for anyone with a desktop PC to contribute resources to the Colossus Grid and earn passive income.
Q: What kind of partnerships do you have planned and can you share any of them? :)
A: The ColossusXT team will announce partnerships when they are available. It’s important to finalize all information and create strong avenues of communication between partners ColossusXT works with in the future. We are currently speaking with many different exchanges, merchants, and discussing options within our technology sector for utilizing the Colossus Grid.
Q: Will shared Masternodes be offered by the COLX team? Or will there be any partnerships with something like StakingLab, StakeUnited, or SimplePosPool? StakingLab allows investors of any size to join their shared Masternodes, so any investor of any size can join. Is this a possibility in the future?
A: ColossusXT has already partnered with StakingLab. We also plan to implement shared Masternodes in the desktop wallet.
Q: How innovative is the Colossus Grid in the privacy coin space?
A: Most privacy coins are focused on being just a currency / form of payment. No other project is attempting to do what we are doing with a focus on user privacy.
Q: Hey guys, are you thinking of integrating with some other platforms like Bancor? I would like that!
A: ColossusXT is in touch with many exchange platforms; however, due to non-disclosure agreements, details cannot be shared until it is mutually decided with the partners. We will always be looking for new platforms to spread the use of COLX in different parts of the world and the crypto space.
Q: What is the reward system for the master node owners?
A: From block 388,800 onwards, the block reward is 1200 COLX, and this is split based on the masternode owner/staker ratio. This split is based on a see-saw algorithm. With an increasing number of masternodes, the see-saw algorithm disincentivizes the establishment of even more masternodes because it lowers their profitability. To be precise, as soon as more than 41.5% of the total COLX coin supply is locked in masternodes, more than 50% of the block reward will be distributed to regular staking nodes. As long as the amount of locked collateral funds is below the threshold of 41.5%, the see-saw algorithm ensures that running a masternode is financially more attractive than running a simple staking node, to compensate for the additional effort that a masternode requires in comparison to a simple staking node. Please refer to our whitepaper for more information.
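As a rough illustration of the 41.5% threshold described above (the exact see-saw curve is defined in the whitepaper; the linear ramp below is only an assumption for the sketch), the split could be computed along these lines:

```python
BLOCK_REWARD = 1200   # COLX per block from block 388,800 onwards
THRESHOLD = 0.415     # fraction of supply locked in masternodes where the split flips

def seesaw_split(locked_fraction: float) -> tuple:
    """Return (masternode_share, staker_share) of the block reward.

    Illustrative only: below the 41.5% threshold masternodes get the larger
    share, above it stakers do. The real see-saw curve is in the ColossusXT
    whitepaper; this linear ramp crossing 50% at the threshold is an assumption.
    """
    mn_fraction = max(0.0, min(1.0, 1.0 - locked_fraction / (2 * THRESHOLD)))
    return mn_fraction * BLOCK_REWARD, (1 - mn_fraction) * BLOCK_REWARD

for frac in (0.30, 0.415, 0.50):
    mn, stake = seesaw_split(frac)
    print(f"{frac:.1%} locked -> masternodes {mn:.0f} COLX, stakers {stake:.0f} COLX")
```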
Q: What other marketplaces has the COLX team been in contact with?
Thanks guys! Love the coin and staff
A: ColossusXT gets in touch with different platforms based on community requests and also based on partnership requests received, upon the ColossusXT business team's mutual agreement. Unfortunately, these possibilities cannot be shared until they are mutually agreed between the partners and the ColossusXT team, due to non-disclosure agreements.
Q: What do you think about the new rules that will soon govern crypto interactions in the EU? they are against anonymous payments
A: Blockchain technology is just now starting to become clear to different governments.
ColossusXT's privacy features protect the end-user from oversharing personal information, as you are probably aware from the multiple emails you've received recently from many websites.
Privacy policies are always being updated and expanded upon. The use of privacy features with utility coins like ColossusXT should be a regular norm throughout blockchain. This movement is part is about decentralization as much as it is about improving technology.
While this news may have a role to play, I don't think it is THE role that will continuously be played as blockchain technology is implemented throughout the world.
Q: Any hints on the next big feature implementation you guys are working on? According to road map - really excited to hear more about the Shared MN and the scale of the marketplace!
A: Current work is focused on the privacy layer of Colossus Grid and completing the updated wallet interface.
Q: Why choose COLX, or should I say, why should we believe COLX will become what you promise in the roadmap? How are you different from all the other privacy coins with blockchains already established and in effect?
A: ColossusXT is an environmentally friendly Proof of Stake coin with Masternode technology, which provides dual layers of protection from 51% attacks. It includes privacy features that protect users while they utilize resources from the Colossus Grid. Some of the previous questions within this AMA may also answer this question.
Q: What tradeoffs do you have using the Colossus Grid versus the more typical distribution?
A: The advantage of supercomputers is that since data can move between processors rapidly, all of the processors can work together on the same tasks. Supercomputers are suited for highly complex, real-time applications and simulations. However, supercomputers are very expensive to build and maintain, as they consist of a large array of top-of-the-line processors, fast memory, custom hardware, and expensive cooling systems. They also do not scale well, since their complexity makes it difficult to easily add more processors to such a precisely designed and finely tuned system. By contrast, the advantage of distributed systems (like the Colossus Grid) is that relative to supercomputers they are much less expensive. Many distributed systems make use of cheap, off-the-shelf computers for processors and memory, which only require minimal cooling costs. In addition, they are simpler to scale, as adding an additional processor to the system often consists of little more than connecting it to the network. However, unlike supercomputers, which send data short distances via sophisticated and highly optimized connections, distributed systems must move data from processor to processor over slower networks, making them unsuitable for many real-time applications.
Q: Why should I choose Colossus instead of another 100,000 altcoins?
A: Many of these alt-coins are very different projects. ColossusXT is the only grid computing project with a focus on user privacy. We have instant transactions and zero-fee transactions, and ColossusXT is one of the very few coins to offer live support. Check out our whitepaper!
Q: Will there be an option (in the future) to choose between an anonymous or public transaction?
A: Zerocoin is an evolution of the current coin mixing feature. Both allow an individual to decide how they would like to send their transactions.
Q: What exchange has highest volume for ColossusXT, and are there any plans for top exchanges soon ?
A: Currently Cryptopia carries the majority of ColossusXT volume. We are speaking with many different exchanges, and preparing requested documentation for different exchanges. ColossusXT intends to be traded on every major exchange globally.
Q: What is the TPS speed that colx blockchain achieves?
A: ColossusXT achieves between 65-67 TPS depending on network conditions currently.
Q: Plans on expanding the dev team?
A: As development funds allow it, the team will be expanded. Development costs are high for a unique product like ColossusXT, and a good majority of our budget is allocated to it.
Q: Can you explain what the ColossusXT Grid project is and what its full purpose is?
A: The Colossus Grid is explained in the whitepaper. The uses for grid computing and storage are vast, and we are only starting to scratch the surface of what this type of computing power can do. There is also a description of the Colossus Grid earlier in this AMA.
Q: Is there mobile wallet for Android and iOS? If not, is there a roadmap?
A: The Android wallet is out of beta and on the Google Play Store; an iOS wallet is planned for development.
The roadmap can be found here: https://colossusxt.io/roadmap/
Q: Is ColossusXT planning on partnering up with other cryptocurrency projects? Such as: Bread and EQUAL.
A: ColossusXT plans on partnering with other crypto projects that make sense. We look for projects that can help alleviate some of our development work / provide quality-of-life upgrades to our investors so that we can focus on Colossus Grid development. We absolutely love it when the community comes to us with great projects to explore.
Q: Did you ever consider a coin burn? Don't you think a coin burn would increase the COLX price and sustain mass adoption? Do you plan on keeping the price of COLX in a range so that potential big investors can invest in a less volatile project?
A: There are no plans to do a coin burn at this time. Please check out our section in the whitepaper about the supply.
Q: What is the next big exchange for COLX to be listed on?
A: There are several exchanges that will be listing ColossusXT soon. Stay tuned for updates within the community as some have already been announced and future announcements.
  1. CryptalDash
  2. NextExchange
  3. CoinPulse
  4. CoinSwitch (Crowdfunding)
  5. Plaak (Crowdfunding)
Q: How will Colx compete with other privacy coins which claim to be better like Privacy?
A: ColossusXT is not competing with other privacy coins. ColossusXT will evolve into the Colossus Grid, which is built on the backbone of a privacy blockchain. In our vision, all these other privacy coins are competing for relevancy with ColossusXT. There are also similar responses to other questions in this AMA that may hit on specifics.
Q: Does COLX have a finite number of coins like bitcoin?
A: No, ColossusXT is Proof of Stake. https://en.wikipedia.org/wiki/Proof-of-stake
Q: What are the advantages of COLX over other competitor coins (eg. ECA)?
A: The only similarity between ColossusXT and Electra is that we are both privacy blockchains. ColossusXT is very much an entirely different project than any other privacy coin in the blockchain world today. The Colossus Grid will be a huge advantage over any other privacy coin, offering the ability for a desktop machine to rent power from others contributing to the Colossus Grid and perform and compute high-level tasks.
Q: How do you feel about some countries frowning upon privacy coins and how do you plan to change their minds (and what do you plan to do about it?)
A: The ColossusXT team tries to view opinions from multiple perspectives so that we can understand each line of thinking. As blockchain technology becomes more widely adopted, so will the understanding of the importance of the privacy features within ColossusXT. Privacy is freedom.
Q: How do you see COLX in disrupting cloud gaming services such as PlayStation Now?
A: Cloud gaming services have not been discussed. Initial marketing of our private grid computing framework will be targeted at home users, governments, and cyber security firms who may require more discretion / anonymity in their work.
Q: Since COLX is a privacy coin and is known for the privacy of its transactions, due to which a lot of money laundering and scams could take place, would COLX and its community be affected by this? And if so, how could we try to prevent it?
A: ColossusXT intends to be known for the Colossus Grid. The Colossus Grid development will be moved up from Q1 2019 to Q3 2018 to reflect this message and prevent further miscommunication about what privacy means for the future of ColossusXT. Previous answers within this AMA may further elaborate on this question.
Q: When do you plan to list your coin on other "bigger" exchanges?
A: ColossusXT is speaking with many different exchanges. These things have many different factors. Exchanges decide on listing dates and we expect to see ColossusXT listed on larger exchanges as we approach the Colossus Grid Beta. The governance system can further assist in funding.
Q: What was the rationale behind naming your coin ColossusXT?
A: Colossus was a set of computers developed by British codebreakers in the years 1943–1945. XT symbolises ‘extended’ as the coin was forked from the original Cv2 coin.
Q: Can you give any details about the E Commerce Marketplace, and its progress?
A: The Ecommerce Marketplace is a project that will receive attention after our development pass on important privacy features for the grid. In general, our roadmap will be changing to put an emphasis on grid development.
Q: How will someone access the grid, and how will you monetize using the grid? Will there be an interface that charges COLX for time on the grid or data usage?
A: The Colossus Grid will be integrated within the ColossusXT wallet. Buying & Selling resources will happen within the wallet interface. You won't be able to charge for "time" on the grid, and have access to unlimited resources. The goal is to have users input what resources they need, and the price they are willing to pay. The Colossus Grid will then look for people selling resources at a value the buyer is willing to pay. Time may come into play based on which resources you are specifically asking for.
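A minimal sketch of that buy/sell matching idea, under the assumption (ours for illustration, not the team's published design) that offers are simply ranked by asking price per unit of the requested resource:

```python
from dataclasses import dataclass

@dataclass
class Offer:
    seller: str
    cpu_hours: int         # resource being offered
    price_per_hour: float  # asking price in COLX

def match_request(cpu_hours_needed: int, max_price: float, offers: list) -> tuple:
    """Fill a buyer's resource request from the cheapest acceptable offers."""
    filled, spent, plan = 0, 0.0, []
    for offer in sorted(offers, key=lambda o: o.price_per_hour):
        if offer.price_per_hour > max_price or filled >= cpu_hours_needed:
            break
        take = min(offer.cpu_hours, cpu_hours_needed - filled)
        filled += take
        spent += take * offer.price_per_hour
        plan.append((offer.seller, take))
    return plan, filled, spent

offers = [Offer("node_a", 50, 0.10), Offer("node_b", 30, 0.08), Offer("node_c", 100, 0.20)]
print(match_request(cpu_hours_needed=60, max_price=0.15, offers=offers))
# -> ([('node_b', 30), ('node_a', 30)], 60, 5.4)
```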
Q: Are there any plans to launch an official YouTube channel with instructional videos about basic use of the wallets and features of COLX? Most people are visual learners and learn much faster about wallets when actually seeing it happen before they try it themselves. This might attract people to ColossusXT and also teach people about basic use of blockchain and cryptocurrency wallets. I ask this because I see a lot of users on Discord and Telegram that are still learning and are asking a lot of really basic questions.
A: ColossusXT has an official YT account with instructional videos: https://www.youtube.com/channel/UCCmMLUSK4YoxKvrLoKJnzng
Q: What are the USPs of COLX compared to other privacy coins?
A: Privacy coins are a dime a dozen. ColossusXT has different end goals than most privacy coins, and this cannot be stated enough. Our goal is not just to be another currency, but to build a sophisticated computing resource sharing architecture on top of the privacy blockchain.
Q: A new exchange will probably gain more liquidity for our coin. If you might choose 3 exchanges to get COLX listed, what would be your top 3?
A: ColossusXT intends to be listed on all major exchanges globally. :)
Q: What is the future of privacy coins? What will be the future colx userbase (beyond the first adopters and enthusiasts)?
A: The future of privacy is the same as it has always been. Privacy is something each and every person owns, until they give it away to someone else. Who is in control of your privacy? You, or another person or entity? The future ColossusXT user base will comprise early adopters, enthusiasts, computer science professionals, and artificial intelligence and computational linguistics professionals, all of whom can utilize the Colossus Grid for a wide range of needs.
Q: Will ColossusXT join more exchanges soon??
A: Yes. :)
Q: So when will Colossus put out lots of advertisement to the various social media sites to get better known? Like Youtube videos etc.
A: As we get closer to a product launch of the Colossus Grid, you’ll begin to see more advertisements, YouTubers, and interviews. We’re looking to also provide some presentations at blockchain conferences in 2018, and 2019.
Q: In your opinion, what are some of the issues holding COLX back from wider adoption? In that vein, what are some of the steps the team is considering to help address those issues?
A: One of the main issues holding ColossusXT back from wider adoption is that our endgame, the Colossus Grid, is very different from other privacy coins. In order to address this issue, the ColossusXT team intends to have a Colossus Grid Beta out by the end of Q4, and we will move development of the Colossus Grid from Q1 2019 to Q3 2018.
Q: Or to see it from another perspective - what are some of the biggest issues with crypto-currency and how does COLX address those issues?
A: The biggest issue is that cryptocurrency is seen as a means to make quick money (which project is going to get the biggest "pump" of the week), and there is not enough focus on building blockchain technologies that solve problems or on creating legitimate business use cases.
For the most part we believe the base of ColossusXT supporters see our end-game, and are willing to provide us with the time and support to complete our vision. The ColossusXT team keeps its head down and keeps pushing forward.
Q: I know it's still early in the development phase but can you give a little insight into what to look forward to regarding In-wallet voting and proposals system for the community? How much power will the community have regarding the direction COLX development takes in the future?
A: The budget and proposal system is detailed in the whitepaper. Masternode owners vote on and guide the development of ColossusXT by voting on proposals put forth by the community and business partners.
Our goal is to make this process as easy and accessible as possible to our community.
Q: Will there be an article explaining the significance of each partnership formed thus far?
A: Yes, the ColossusXT team will announce partners on social media, and community outlets. A detailed article of what partnerships mean will be available on our Medium page: https://medium.com/@colossusxt
Q: What potential output from the Grid is expected and what would it's use be?
For example, x teraflops which could process y solutions to protein folding in z time.
A: There are many uses for grid computing. A crypto enthusiast mining crypto, a cyber security professional cracking a password using brute force, or a scientist producing climate prediction models.
The resources available to put towards grid projects will be determined by the number of nodes sharing resources, and the amount of resources an individual is willing to purchase with COLX.
All individuals will not have access to infinite grid resources.
Q: Is there a paper wallet available?
A: Yes, see https://mycolxwallet.org
Q: Is there a possibility of implementing quantum computer measures in the future?
A: This is a great idea for potentially another project in the future. Currently this is not possible with the Colossus Grid. Instead of the bits which conventional computers use, a quantum computer uses quantum bits, known as qubits. In classical computing, a bit is a single piece of information that can exist in two states, 1 or 0. Quantum computing uses quantum bits, or 'qubits', instead. These are quantum systems with two states. However, unlike an ordinary bit, they can store much more information than just 1 or 0, because they can exist in any superposition of these values.
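For reference, the standard textbook way to write that superposition (general quantum-computing notation, nothing COLX-specific) is:

```latex
% A single qubit is a normalized superposition of the basis states |0> and |1>;
% measuring it yields 0 with probability |alpha|^2 and 1 with probability |beta|^2.
\[
  \lvert \psi \rangle = \alpha \lvert 0 \rangle + \beta \lvert 1 \rangle,
  \qquad \alpha, \beta \in \mathbb{C},
  \qquad \lvert \alpha \rvert^{2} + \lvert \beta \rvert^{2} = 1 .
\]
```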
Q: Do you plan to do a coin burn?
A: No future coin burns are planned. Anything like this would go through a governance proposal and Masternode owners would vote on this. This is not anything we’ve seen within the community being discussed.
Q: Can I check the exact number of current COLX master node and COLX staking node?
A: Yes. You can view the Masternodes and the amount of ColossusXT (COLX) being staked by viewing the block explorer.
Block explorer: https://chainz.cryptoid.info/colx/#!extraction
Q: What incentive could we give a youtuber to do the BEST video of ColossusXT (COLX)?
A: We've been approached by several YouTubers. The best thing a YouTuber can do is understand what ColossusXT is, join the community, ask questions if there is something they don't understand.
The problem with many YouTubers is that some of them are just trying to get paid, they don't really care to provide context or research a project.
Disclaimer: This is not all YouTubers, but many.
Q: In which ways is the ColossusGrid different from other supercomputer / distributed computing projects out there. Golem comes to mind. Thanks!
A: The main difference is that we are focused on the end users privacy, and the types of users that we will be targeting will be those that need more discretion / anonymity in their work. We are building framework that will continue to push the boundaries of user privacy as it relates to grid computing.
Q: Can we please complete our roadmap ahead of schedule? I find that most other coins that do this actually excel in terms of price and community members. Keep on top of the game :)
A: The ColossusXT roadmap is a very fluid document, and it is always evolving. Some items are moved up in priority, and others are moved back. The roadmap should not be thought of as something set in stone.
Q: Does COLX have master nodes?
A: Yes. ColossusXT has masternodes.
Q: Have you thought about providing a method to insert a form of payment in COLX on any page that wants to use cryptocurrencies, in a fast and simple way, in order to drive mass adoption?
A: This option already exists: https://mycryptocheckout.com/coins/
Q: What do you think your community progress till now?
A: The community has grown greatly in the last 3 months. We’re very excited to go from 13 to 100 questions in our quarterly AMA. Discord, Telegram, and Twitter are growing everyday.
Q: I noticed on the roadmap: Coinomi and Shapeshift wallet integration. Can you tell me more about this? I am new to crypto and a new COLX investor, so I don't know much about this. Thanks, and keep up the good work.
A: Coinomi is a universal wallet. ColossusXT will have multiple wallet platforms available to it. Shapeshift allows you to switch one crypto directly for another without the use of a coupler (BTC).
Q: Is "A general-purpose decentralized marketplace" written in the whitepaper the same as "E-COMMERCE MARKETPLACE" written on the roadmap?
Please tell me about "A general-purpose decentralized marketplace" or "E-COMMERCE MARKETPLACE" in detail.
A: Details will be posted as we get closer to the marketplace. It will be similar to other marketplaces within blockchain. Stay tuned for more information by following us on Twitter.
Q: History has shown that feature-based technologies always get replaced by technologies with platforms that incorporate those features; what is ColossusXT's big picture?
A: The Colossus Grid. Which has been explained within this AMA in a few different ways.
Q: What are the main objectives for the COLX team this year? Give me 5 reasons why COLX will survive from a long-term perspective. Do you see masternodes working in a private, easy-to-set-up wallet on a DEX network? Already a big fan, have a nice day!
A: Getting into Q3, our main objective is to get a working product of the Colossus Grid out by the end of Q4.
  1. Community - Our community is growing every day as knowledge about what we're building grows. When the Colossus Grid is online we expect expansion to continue at a rapid pace as users connect to share resources.
  2. Team - The ColossusXT team will continue to grow. We are stewards of a great community and an amazing project. Providing a level of support currently unseen in many other projects through Discord. The team cohesion and activity within the community is a standard we intend to set within the blockchain communities.
  3. Features - ColossusXT and The Colossus Grid will have user friendly AI. We understand the difficulties when users first enter blockchain products. The confusion between keys, sending/receiving addresses, and understanding available features within. Guides will always be published for Windows/Mac/Linux with updates so that these features can be easily understood.
  4. Colossus Grid - The Colossus Grid answers real world problems, and provides multiple solutions while also reducing energy consumption.
  5. Use Case - Many of the 1000+ other coins on the market don’t have the current use-case that ColossusXT has, let alone the expansion of utility use-cases in multiple sectors.
Q: Will the whitepaper be available in Portuguese?
A: Yes. We will be adding some language bounties to the website in the future. Stay tuned.
Q: I notice in your white paper there are future plans for decentralised governance and masternode voting. While all that is great, how do you plan on preventing malicious proposals from getting through by gaming the system (i.e. bot votes, multiple accounts, spam, etc.)?
A: You cannot game the system. Masternode owners get 1 vote.
Q: Been a massive fan of this project since Dec last year, anyways what was the reason you guys thought of putting XT at the end of Colossus. :)
A: XT symbolizes ‘extended’ as the coin was forked from the original Cv2 coin.
Q: Do you plan a partnership within the banking industry to capitalize on such large amounts of money being moved continuously?
A: The focus will be on the Colossus Grid and Grid computing, with the option to participate in the financial sector of Blockchain through Polis Pay, and other partnerships that can be announced in the future.
Q: When will be COLX supported By The Ledger Wallet?
A: Integration with cold storage wallets is planned. I myself (PioyPioyPioy) have a Ledger Nano S and I love it!
Q: Where do you see yourself in five years?
A: The goal 5 years from now would be to be a leading competitor in cloud computing and storage. Providing government, private cybersecurity, and individuals with efficient solutions to Super-computing, cloud storage through Blockchain infrastructure. I would like to see hardware options of connecting to the grid to utilize resources after the Colossus Grid is online, and I think this can contribute to many use-case scenarios.
Q: How can I suggest business partnerships and strategic ideas etc to the ColossusXT team?
A: Join us in Discord. Members of the team here are active daily, you can also contact us at: [[email protected]](mailto:[email protected])
Q: A great project requires good funding. How do you plan to incorporate fund sourcing and management into the long-term planning of this project?
A: Check out our governance section within the whitepaper. :)
Website: https://colossusxt.io
Whitepaper: https://colossuscoinxt.org/whitepape
Roadmap: https://colossuscoinxt.org/roadmap/
Follow ColossusXT on:
Twitter: https://twitter.com/colossuscoinxt
Facebook Page: https://www.facebook.com/ColossusCoin/
Telegram: https://web.telegram.org/#/im?p=s1245563208_12241980906364004453
Discord: https://discord.gg/WrnAPcx
Apply to join the team: https://docs.google.com/forms/d/1YcOoY6nyCZ6aggJNyMU-Y5me8_gLTHkuDY4SrQPRe-4/viewform?edit_requested=true
Contribute an idea: https://colossusxt.fider.io/
Q2 AMA Questions: https://www.reddit.com/ColossuscoinX/comments/8ppkxf/official_colossusxt_ama_q2/
Previous AMA: https://www.reddit.com/ColossuscoinX/comments/8bia7o/official_colossusxt_ama/
submitted by PioyPioyPioy to ColossuscoinX

DAG Technology Analysis and Measurement

This report was produced by the Fire Coin Blockchain Application Research Institute. Authors: Yuan Yuming and Hu Zhiwei. For the PDF version, please see the original text download link.
Summary
The Fire Coin Blockchain Application Research Institute conducted research on distributed ledger technology based on the directed acyclic graph (DAG) data structure from a technical perspective, and obtained its main research results through specific technical tests of the typical representative project IOTA.
Report body
1 Introduction
Blockchain is a distributed ledger technology, but distributed ledger technology is not limited to "blockchain" in the narrow sense. In the wave of digital economic development, more distributed ledger technologies are being explored and applied in order to improve on the original technology and meet more practical business application scenarios. The Directed Acyclic Graph (hereinafter referred to as "DAG") is one representative.
What is DAG technology, and what is the design thinking behind it? What is its actual application effect? We attempted to reach analytical conclusions through a deep analysis of DAG technology and actual test runs of the representative project IOTA.
It should also be noted that the indicator data obtained from the tests are not, and should not be considered, proof or confirmation of the final effect of the IOTA platform or project. This is hereby declared.
2. Main conclusions
After research and test analysis, we have the following main conclusions and technical recommendations:
3. DAG Introduction
3.1. Introduction to DAG Principle
A DAG (Directed Acyclic Graph) is a data structure representing a directed graph in which, starting from any vertex, it is impossible to follow the edges back to that same vertex (there are no loops), as shown in the figure below:
[Figure: example of a directed acyclic graph]
Since distributed ledgers based on DAG technology (hereinafter simply "DAG") were proposed in recent years, many people have considered them a hopeful replacement for blockchain technology in the narrow sense, because the design goal of DAG is to preserve the advantages of the blockchain while improving on its shortcomings.
Unlike the traditional linear blockchain structure, the transaction records of a DAG-based distributed ledger platform, as represented by IOTA, form a directed acyclic graph structure, as shown in the following figure.
[Figure: DAG-structured transaction records in IOTA]
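As a minimal illustration of the structural property described above (generic graph code, not any particular project's implementation), a ledger stored as a transaction-to-approved-transactions map can be checked for acyclicity with a topological sort:

```python
from collections import deque

def is_dag(approvals: dict) -> bool:
    """Kahn's algorithm: the graph is acyclic iff every node can be removed
    once all edges pointing into it have been removed. Assumes every
    referenced transaction appears as a key in the map."""
    indegree = {node: 0 for node in approvals}
    for approved in approvals.values():
        for target in approved:
            indegree[target] += 1
    queue = deque(node for node, degree in indegree.items() if degree == 0)
    visited = 0
    while queue:
        node = queue.popleft()
        visited += 1
        for target in approvals[node]:
            indegree[target] -= 1
            if indegree[target] == 0:
                queue.append(target)
    return visited == len(approvals)

# Each transaction approves earlier transactions (edges point back in time),
# so a valid ledger never contains a cycle.
ledger = {"genesis": [], "tx1": ["genesis"], "tx2": ["genesis"], "tx3": ["tx1", "tx2"]}
print(is_dag(ledger))                    # True
print(is_dag({"a": ["b"], "b": ["a"]}))  # False: a cycle
```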
3.2. DAG characteristics
Because its data structure differs from that of the earlier blockchain, DAG-based distributed ledger technology has the characteristics of high scalability and high concurrency, and is suitable for IoT scenarios.
3.2.1. High scalability, high concurrency
The data synchronization mechanism of traditional linear blockchains (such as Ethereum) is synchronous, which may cause network congestion. A DAG network adopts an asynchronous communication mechanism, allowing concurrent writes. Multiple nodes can submit transactions simultaneously, at different tempos, without a strict sequence. Therefore, the network's data may be inconsistent at any given moment, but it will eventually be synchronized.

3.2.2. Applicable to IoT scenarios

In a traditional blockchain network, each block contains many transactions, which miners package and broadcast together, involving multiple users. In a DAG network, there is no concept of a "block"; the smallest unit of the network is a "transaction". Each new transaction needs to verify two previous transactions, so the DAG network does not need miners to relay trust, and transfers do not require a fee, which makes DAG technology suitable for small payments.
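A minimal sketch of that append rule, assuming a simplified tangle in which "verifying" two previous transactions just means referencing (approving) them; the small proof of work and signature checks a real implementation performs are omitted:

```python
import random

class Tangle:
    """Toy DAG ledger: each new transaction approves two earlier transactions,
    preferring current tips (transactions nobody has approved yet)."""
    def __init__(self):
        self.approvals = {"genesis": []}  # tx id -> the txs it approves
        self.approved = set()             # txs that already have an approver
        self.counter = 0

    def tips(self) -> list:
        return [tx for tx in self.approvals if tx not in self.approved]

    def add_transaction(self) -> str:
        candidates = self.tips()
        if len(candidates) < 2:
            candidates = list(self.approvals)  # fall back to any earlier tx
        chosen = random.sample(candidates, k=min(2, len(candidates)))
        self.counter += 1
        tx_id = f"tx{self.counter}"
        self.approvals[tx_id] = chosen        # "verifying" = approving them
        self.approved.update(chosen)
        return tx_id

random.seed(0)
tangle = Tangle()
for _ in range(5):
    tangle.add_transaction()
print(tangle.approvals)  # every transaction after the first references earlier ones
```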
4. Analysis of technical ideas
Trilemma, or "trilemma", means that in a particular situation, only two of the three advantageous options can be selected or one of the three adverse choices must be chosen. This type of selection dilemma has related cases in various fields such as religion, law, philosophy, economics, and business management.Blockchain is no exception. The impossible triangle in the blockchain is: Scalability, Decentralization, and Security can only choose two of them.
If you analyze DAG technology according to this idea, according to the previous introduction, then DAG has undoubtedly occupied the two aspects of decentralization and scalability. The decentralization and scalability of the DAG can be considered as two-sided, because of the asynchronous accounting features brought about by the DAG data structure, while achieving the high degree of decentralization of the participating network nodes and the scalability of the transaction.
5. Open problems
Since the data structure delivers decentralization and scalability at the same time, the impossible-triangle argument suggests that security is the hidden weakness. Could DAG, as a relatively novel and unusual structure, nevertheless achieve security as well? The actual results suggest not.
5.1. Double-spend problem
The asynchronous communication of a DAG makes double-spend attacks possible. For example, an attacker can place two conflicting transactions (spending the same funds) at two different points in the network. Both keep being approved by later transactions until they eventually appear in the approval path of a single transaction, at which point the network discovers the conflict. The transaction where the two branches converge can then be used to decide which of the conflicting transactions is the double spend.
If the approval paths are too short, a "blowball" problem can arise: in the extreme case where most transactions are "lazy" and approve only early transactions, the transaction graph collapses into a topology centered on a small core of early transactions. That is bad for a DAG, which relies on an ever-growing flow of new transactions to increase network reliability.
For now, therefore, the double-spend problem has to be addressed with designs that take the actual network conditions into account; different DAG networks have their own solutions.
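The conflict check described above can be sketched as follows (a simplified illustration with made-up transaction and output names): each transaction records the output it spends and the transactions it approves, and when a new transaction arrives we walk everything it directly or indirectly approves and flag any output spent twice on that path.

def find_double_spends(approves, spends, start_tx):
    """Walk every transaction reachable from start_tx along approval edges and
    report outputs spent by more than one of them.
    approves: tx -> list of approved txs; spends: tx -> id of the output it spends."""
    seen, stack = set(), [start_tx]
    spenders = {}                                  # output id -> txs spending it
    while stack:
        tx = stack.pop()
        if tx in seen:
            continue
        seen.add(tx)
        out = spends.get(tx)
        if out is not None:
            spenders.setdefault(out, []).append(tx)
        stack.extend(approves.get(tx, []))
    return {out: txs for out, txs in spenders.items() if len(txs) > 1}

# txA and txB both spend "coin1" on different branches; txC is the first transaction
# whose approval path contains both, so validating txC exposes the conflict.
approves = {"txA": ["genesis"], "txB": ["genesis"], "txC": ["txA", "txB"]}
spends = {"txA": "coin1", "txB": "coin1"}
print(find_double_spends(approves, spends, "txC"))   # {'coin1': ['txB', 'txA']} (order may vary)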
5.2. Shadow chain problem
Because of the double-spend risk, an attacker able to build a sufficient number of transactions can fork a fraudulent branch (a "shadow chain") off the real network data, include a double-spend transaction in it, and later merge the branch back into the DAG network. In that case the fraudulent branch may replace the original transaction data.
6. Current improvement schemes
At present, projects mainly guarantee security by sacrificing some of the DAG's native characteristics.
The IOTA project uses a Markov chain Monte Carlo (MCMC) approach. IOTA introduces a cumulative weight for each transaction, recording how many times it has been referenced, as a measure of its importance. The MCMC algorithm chooses which existing transactions a new transaction should reference by running a random walk biased by these cumulative weights: the more heavily referenced a path is, the more likely the walk is to select it. The walk strategy was further optimized in version 1.5.0 to keep the "width" of the transaction topology within a reasonable range, making the network more secure.
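The weighted-walk idea can be sketched like this (a rough approximation of the concept rather than IOTA's actual MCMC implementation; cumulative weights are assumed to be precomputed, and alpha is an illustrative parameter controlling how strongly the walk prefers heavier branches):

import math
import random

def weighted_walk(approvers, weight, start, alpha=0.05):
    """Random walk from `start` toward the tips. At each step, choose among the
    transactions approving the current one with probability proportional to
    exp(alpha * cumulative_weight); return the tip where the walk ends.
    approvers: tx -> txs that approve it; weight: tx -> cumulative weight."""
    current = start
    while approvers.get(current):
        candidates = approvers[current]
        scores = [math.exp(alpha * weight[c]) for c in candidates]
        current = random.choices(candidates, weights=scores, k=1)[0]
    return current                       # nothing approves it yet: it is a tip

# Toy tangle: the genesis is approved by a heavily referenced branch and a light one.
approvers = {"genesis": ["heavy1", "light1"], "heavy1": ["heavy2"], "heavy2": [], "light1": []}
weight = {"heavy1": 40, "heavy2": 10, "light1": 2}
picks = [weighted_walk(approvers, weight, "genesis", alpha=0.1) for _ in range(1000)]
print(picks.count("heavy2"), "of 1000 walks ended on the heavy branch")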
However, when the platform first starts up, the limited number of participating nodes and transactions makes it hard to prevent a malicious group from flooding the network with transactions from many nodes and mounting a shadow-chain attack. An authoritative arbiter is therefore needed to determine the validity of transactions. In IOTA this node is the Coordinator, which periodically takes a snapshot of the current transaction network (the Tangle); the transactions contained in the snapshot are confirmed as valid. The Coordinator is not meant to exist forever: as the network runs and grows, IOTA plans to remove it at some point in the future.
Byteball's improvement scheme is characterized by its witnesses and main chain. Because the DAG structure yields a large number of transactions with only a partial order, avoiding double spends requires establishing a total order over them, forming a transaction main chain: the transaction that appears earlier on the main chain is considered the valid one. Witnesses, roles held by well-known users or institutions, build the main chain by continuously sending transactions that confirm other users' transactions.
These schemes can also change the character of the DAG-based platform. Taking IOTA as an example, the introduction of the Coordinator reduces its decentralization to some extent.
7. Actual operation
7.1. Positive effects
Besides addressing security, the above schemes can also help with smart contracts to some extent.
The native features of a DAG create two problems here: (1) confirmation time is unpredictable, so the current retransmission mechanism requires fairly complicated timeout handling on the client side, where a simple one-shot confirmation would be preferable; and (2) there is no global ordering mechanism, which limits the kinds of operations the system can support. As a result it is difficult to implement a Turing-complete smart-contract system on a DAG-based distributed ledger.
For smart contracts to run, something has to provide this ordering and finality; the current Coordinator or main-chain designs can achieve similar results.
7.2. Negative effects
TPS is one of the most intuitive indicators, and in theory a DAG's TPS is unbounded. If the maximum TPS of the IOTA platform is the capacity of a factory, then day-to-day TPS is the factory's daily output.
On the maximum side, the April 2017 IOTA stress test showed the network processing 895 TPS with 112 CTPS (confirmed transactions per second), on a small test network of 250 nodes.
On the day-to-day side, publicly available data show the main network recently averaging about 8.2 TPS, with CTPS around 2.7.
[Figure: main-network TPS and CTPS statistics]
The test network averages about 4 TPS and about 3 CTPS.
[Figure: test-network TPS and CTPS statistics]
Data source: Discord bot generic-iota-bot#5760.
Is this related to the existence of the Coordinator? Actual testing is needed to find out.
8. Measured analysis
Operational statistics from the public test network depend on many factors. For further analysis, we again take the IOTA platform as an example and build a private test environment for measurement.
8.1. Test Architecture
The relationships between the components built for this test are shown below.
[Figure: test architecture and component relationships]
8.2. Test hardware environment
The servers are Amazon AWS EC2 c5.4xlarge instances: 16-core 3 GHz Intel Xeon Platinum 8124M CPUs, 32 GB of memory, a 10 Gbps LAN between servers with inter-server latency (ping) under 1 ms, running Ubuntu 16.04.
8.3. Test scenarios and results analysis

8.3.1. Default PoW Difficulty Value

Although IOTA has no concept of "miners", a node must still perform a proof of work before sending a transaction, to keep the network from being flooded with transactions. The Minimum Weight Magnitude (MWM) plays a role similar to Bitcoin's difficulty: the PoW result must end in a given number of "9" trytes, and a "9" tryte corresponds to the trits "000" in the ternary encoding IOTA uses. The difficulty value can be set before the node is started.
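The structure of that check can be sketched as follows (an illustration only: a SHA-256 hash stands in for IOTA's ternary Curl hash, and the trit conversion is a simplification, so only the shape of the Minimum Weight Magnitude test carries over):

import hashlib

def to_trits(data: bytes, length: int = 81):
    """Convert a byte hash into `length` trits in {-1, 0, 1}.
    A stand-in for IOTA's 243-trit Curl output, used only to show the MWM check."""
    n = int.from_bytes(data, "big")
    trits = []
    for _ in range(length):
        n, r = divmod(n, 3)
        trits.append(r - 1)              # map base-3 digits {0,1,2} to {-1,0,1}
    return trits

def meets_mwm(trits, mwm: int) -> bool:
    """True if the hash ends in at least `mwm` zero trits (the '9' trytes of the text)."""
    return all(t == 0 for t in trits[-mwm:])

def do_pow(payload: bytes, mwm: int) -> int:
    """Brute-force a nonce until the stand-in hash satisfies the MWM condition."""
    nonce = 0
    while True:
        digest = hashlib.sha256(payload + nonce.to_bytes(8, "big")).digest()
        if meets_mwm(to_trits(digest), mwm):
            return nonce
        nonce += 1

# Each additional zero trit multiplies the expected work by 3, much as each extra
# zero bit doubles it in Bitcoin; a small mwm keeps this toy example fast.
print("nonce found:", do_pow(b"toy-transaction", mwm=6))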
Currently the production network sets the IOTA difficulty to 14 and the test network to 9. We therefore first test with the test network's default difficulty of 9, obtaining the following results.
[Figure: test results with the default PoW difficulty of 9]
Since each IOTA bundle contains multiple transfers, the actual processed TPS is higher than the send rate. However, a script parsing the zmq feed shows that the observed TPS is very low, and the number of requests that can be successfully sent per second is also low.
Analysis shows that this is because the test uses VPS instances, where the proof of work is computed on the CPU, so the transaction rate is limited mainly by how quickly the sending side can produce transactions.

8.3.2. Decrease the PoW difficulty value

We re-test with the difficulty value set to 1 and obtain the following results.
[Figure: test results with PoW difficulty 1]
As the results show, TPS increases once the difficulty is reduced. The current TPS of the IOTA project is therefore not bottlenecked by the Coordinator but mainly by the hardware and network of the client sending the transactions. The IOTA community is working on an FPGA implementation of the Curl algorithm and on CPU instruction-set optimizations; our results confirm that the performance potential of the DAG platform can be explored further along these lines.

8.3.3. Reduce the number of test network nodes

Given the characteristics of a DAG, the platform's actual TPS may also depend on the number of network nodes. Keeping the difficulty at 1, we reduce the number of network nodes to 10 and repeat the test, obtaining the following results.
[Figure: test results with 10 network nodes]
As the results show, when the number of nodes decreases, the actual processed TPS also decreases and falls below the send rate. This indicates that in a DAG environment, maintaining a sufficiently large set of nodes helps transaction processing.
9. Reference materials
https://www.iota.org/
https://en.wikipedia.org/wiki/Trilemma
https://blog.iota.org/new-tip-selection-algorithm-in-iri-1-5-0-61294c1df6f1
https://en.wikipedia.org/wiki/Markov_chain_Monte_Carlo
https://byteball.org/
https://www.iotachina.com/iota.html
https://www.iotachina.com/iota_tutorial_1.html
submitted by i0tal0ver to Iota

IRC Log from Ravencoin Open Developer Meeting - Aug 24, 2018

[14:05] <@wolfsokta> Hello Everybody, sorry we're a bit late getting started
[14:05] == block_338778 [[email protected]/web/freenode/ip.72.214.222.226] has joined #ravencoin-dev
[14:06] <@wolfsokta> Here are the topics we would like to cover today • 2.0.4 Need to upgrade - What we have done to communicate to the community • Unique Assets • iOS Wallet • General Q&A
[14:06] == Chatturga changed the topic of #ravencoin-dev to: 2.0.4 Need to upgrade - What we have done to communicate to the community • Unique Assets • iOS Wallet • General Q&A
[14:06] <@wolfsokta> Daben, could you mention what we have done to communicate the need for the 2.0.4 upgrade?
[14:07] == hwhwhsushwban [[email protected]/web/freenode/ip.172.58.37.35] has joined #ravencoin-dev
[14:07] <@wolfsokta> Others here are free to chime in where they saw the message first.
[14:07] == hwhwhsushwban [[email protected]/web/freenode/ip.172.58.37.35] has quit [Client Quit]
[14:08] Whats up bois
[14:08] hi everyone
[14:08] hi hi
[14:08] <@wolfsokta> Discussing the 2.0.4 update and the need to upgrade.
[14:08] <@Chatturga> Sure. As most of you are aware, the community has been expressing concerns with the difficulty oscillations, and were asking that something be done to the difficulty retargeting. Many people submitted suggestions, and the devs decided to implement DGW.
[14:09] <@Tron> I wrote up a short description of why we're moving to a new difficulty adjustment. https://medium.com/@tronblack/ravencoin-dark-gravity-wave-1da0a71657f7
[14:09] <@Chatturga> I have made posts on discord, telegram, bitcointalk, reddit, and ravencointalk.org from testnet stages through current.
[14:10] <@Chatturga> If there are any other channels that can reach a large number of community members, I would love to have more.
[14:10] <@wolfsokta> Thanks Tron, that hasn't been shared to the community at large yet, but folks feel free to share it.
[14:10] When was this decision made and by whom and how?
[14:10] <@Chatturga> I have also communicated with the pool operators and exchanges about the update. Of all of the current pools, only 2 have not yet updated versions.
[14:11] <@wolfsokta> The decision was made by the developers through ongoing requests for weeks made by the community.
[14:12] <@wolfsokta> Evidence was provided by the community of the damages that could be caused to projects when the wild swings continue.
[14:12] So was there a meeting or vote? How can people get invited
[14:12] <@Tron> It was also informed by my conversations with some miners that recommended that we make the change before the coin died. They witnessed similar oscillations from which other coins never recovered.
[14:13] only two pools left to upgrade is good, what about the exchanges? Any word on how many of those have/have not upgraded?
[14:13] <@wolfsokta> We talked about here in our last meeting Bruce_. All attendees were asked if they had any questions or concerns.
[14:13] == blondfrogs [[email protected]/web/freenode/ip.185.245.87.219] has joined #ravencoin-dev
[14:13] == roshii [[email protected]/web/freenode/ip.41.251.25.100] has joined #ravencoin-dev
[14:13] sup roshii long time no see
[14:14] <@Chatturga> Bittrex, Cryptopia, and IDCM have all either updated or have announced their intent to update.
[14:14] == wjcgiwgu283ik3cj [[email protected]/web/freenode/ip.172.58.37.35] has joined #ravencoin-dev
[14:15] sup russki
[14:15] what's the status here?
[14:15] I don’t think that was at all clear from the last dev meeting
[14:15] I can’t be the only person who didn’t understand it
[14:15] <@wolfsokta> Are there any suggestions on how to communicate the need to upgrade even further? I am concerned that others might also not understand.
[14:17] I’m not sold on the benefit and don’t understand the need for a hard fork — I think it’s a bad precedent to simply go rally exchanges to support a hard fork with little to no discussion
[14:17] so just to note, the exchanges not listed as being upgraded or have announced their intention to upgrade include: qbtc, upbit, and cryptobridge (all with over $40k usd volume past 24 hours according to coinmarketcap)
[14:18] <@wolfsokta> I don't agree that there was little or no discussion at all.
[14:19] <@wolfsokta> Looking back at our meeting notes from two weeks ago "fork" was specifically asked about by BrianMCT.
[14:19] If individual devs have the power to simple decide to do something as drastic as a hard fork and can get exchanges and miners to do it that’s got a lot of issues with centralization
[14:19] <@wolfsokta> It had been implemented on testnet by then and discussed in the community for several weeks before that.
[14:19] == under [[email protected]/web/freenode/ip.72.200.168.56] has joined #ravencoin-dev
[14:19] howdy
[14:19] Everything I’ve seen has been related to the asset layer
[14:19] I have to agree with Bruce_, though I wasn't able to join the last meeting here. That said I support the fork
[14:20] Which devs made this decision to do a fork and how was it communicated?
[14:20] well mostly the community made the decision
[14:20] Consensus on a change is the heart of bitcoin development and I believe the devs have done a great job building that consensus
[14:20] a lot of miners were in uproar about the situation
[14:20] <@wolfsokta> All of the devs were supporting the changes. It wasn't done in isolation at all.
[14:21] This topic has been a huge discussion point within the RVN mining community for quite some time
[14:21] the community and miners have been having issues with the way diff is adjusted for quite some time now
[14:21] Sure I’m well aware of that -
[14:21] Not sold on the benefits of having difficulty crippled by rented hashpower?
[14:21] The community saw a problem. The devs got together and talked about a solution and implemented a solution
[14:21] I’m active in the community
[14:22] So well aware of the discussions on DGW etc
[14:22] Hard fork as a solution to a problem community had with rented hashpower (nicehash!!) sounds like the perfect decentralized scenario!
[14:23] hard forks are very dangerous
[14:23] mining parties in difficulty drops are too
[14:23] <@wolfsokta> Agreed, we want to keep them to an absolute minimum.
[14:23] But miners motivation it’s the main vote
[14:24] What would it take to convince you that constantly going from 4 Th/s to 500 Gh/s every week is worse for the long term health of the coin than the risk of a hard fork to fix it?
[14:24] == Tron [[email protected]/web/freenode/ip.173.241.144.77] has quit [Ping timeout: 252 seconds]
[14:24] This hardfork does include the asset layer right? if so why is it being delayed in implementation?
[14:24] <@wolfsokta> Come back Tron!
[14:24] coudl it have been implement through bip9 voting?
[14:24] also hard fork is activated by the community! that's a vote thing!
[14:24] @mrsushi to give people time to upgrade their wallet
[14:25] @under, it would be much hard to keep consensus with a bip9 change
[14:25] <@wolfsokta> We investigated that closely Under.
[14:25] == Tron [[email protected]/web/freenode/ip.173.241.144.77] has joined #ravencoin-dev
[14:25] <@wolfsokta> See Tron's post for more details about that.
[14:25] <@spyder_> Hi Tron
[14:25] <@wolfsokta> https://medium.com/@tronblack/ravencoin-dark-gravity-wave-1da0a71657f7
[14:25] Sorry about that. Computer went to sleep.
[14:26] I'm wrong
[14:26] 2 cents. the release deadline of october 31st puts a bit of strain on getting code shipped. (duh). but fixing daa was important to the current health of the coin, and was widely suppported by current mining majority commuity. could it have been implemented in a different manner? yes . if we didnt have deadlines
[14:27] == wjcgiwgu283ik3cj [[email protected]/web/freenode/ip.172.58.37.35] has quit [Quit: Page closed]
[14:27] sushi this fork does not include assets. it's not being delayed though, we're making great progress for an Oct 31 target
[14:28] I don’t see the urgency but my vote doesn’t matter since my hash power is still CPUs
[14:28] <@wolfsokta> We're seeing the community get behind the change as well based on the amount of people jumping back in to mine through this last high difficulty phase.
[14:28] So that will be another hardfork?
[14:28] the fork does include the asset code though set to activate on oct 30th
[14:28] yes
[14:29] <@wolfsokta> Yes, it will based on the upgrade voting through the BIP9 process.
[14:29] I wanted to ask about burn rates from this group: and make a proposal.
[14:29] we're also trying hard to make it the last for awhile
[14:29] Can you clear up the above — there will be this one and another hard fork?
[14:29] <@wolfsokta> Okay, we could discuss that under towards the end of the meeting.
[14:30] If this one has the asset layer is there something different set for October
[14:30] <@wolfsokta> Yes, there will be another hard fork on October 31st once the voting process is successful.
[14:31] <@wolfsokta> The code is in 2.0.4 now and assets are active on testnet
[14:31] Bruce, the assets layer is still being worked on. Assets is active on mainnet. So in Oct 31 voting will start. and if it passes, the chain will fork.
[14:31] this one does NOT include assets for mainnet Bruce -- assets are targeted for Oct 31
[14:31] not***
[14:31] not active****
[14:31] correct me if I'm wrong here, but if everyone upgrades to 2.0.4 for this fork this week, the vote will automatically pass on oct 31st correct? nothing else needs to be done
[14:31] Will if need another download or does this software download cover both forks?
[14:31] <@wolfsokta> Correct Urgo
[14:32] thats how the testnet got activated and this one shows "asset activation status: waiting until 10/30/2018 20:00 (ET)"
[14:32] Will require another upgrade before Oct 31
[14:32] thank you for the clarification wolfsokta
[14:32] <@wolfsokta> It covers both forks, but we might have additional bug fixes in later releases.
[14:32] So users DL one version now and another one around October 30 which activates after that basically?
[14:33] I understand that, but I just wanted to make it clear that if people upgrade to this version for this fork and then don't do anything, they are also voting for the fork on oct 31st
[14:33] Oh okay — one DL?
[14:33] Bruce, Yes.
[14:33] Ty
[14:33] well there is the issue that there maybe some further consensus bugs dealing with the pruneability of asset transactions that needs to be corrected between 2.0.4 and mainnet. so i would imagine that there will be further revisions required to upgrade before now and october 31
[14:33] @under that is correct.
[14:34] I would highly recommend bumping the semver up to 3.0.0 for the final pre 31st release so that the public know to definitely upgrade
[14:34] @under +1
[14:35] out of curiosity, have there been many bugs found with the assets from the version released in july for testnet (2.0.3) until this version? or is it solely a change to DGW?
[14:35] <@wolfsokta> That's not a bad idea under.
[14:35] <@spyder_> @under good idea
[14:35] @urgo. Bugs are being found and fixed daily.
[14:35] Any time the protocol needs to change, there would need to be a hard fork (aka upgrade). It is our hope that we can activate feature forks through the BIP process (as we are doing for assets). Mining pools and exchanges will need to be on the newest software at the point of asset activation - should the mining hash power vote for assets.
[14:35] blondfrogs: gotcha
[14:35] There have been bugs found (and fixed). Testing continues. We appreciate all the bug reports you can give us.
[14:36] <@wolfsokta> Yes! Thank you all for your help in the community.
[14:37] (pull requests with fixes and test coverage would be even better!)
[14:37] asset creation collision is another major issue. current unfair advantage or nodes that fore connect to mining pools will have network topologies that guarantee acceptance. I had discussed the possibility of fee based asset creation selection and i feel that would be a more equal playing ground for all users
[14:38] *of nodes that force
[14:38] <@wolfsokta> What cfox said, we will always welcome development help.
[14:38] So just to make sure everyone know. When assets is ready to go live on oct 31st. Everyone that wants to be on the assets chain without any problems will have to download the new binary.
[14:39] <@wolfsokta> The latest binary.
[14:39] under: already in the works
[14:39] excellent to hear
[14:39] == UserJonPizza [[email protected]/web/freenode/ip.24.218.60.237] has joined #ravencoin-dev
[14:39] <@wolfsokta> Okay, we've spent a bunch of time on that topic and I think it was needed. Does anybody have any other suggestions on how to get the word out even more?
[14:40] maybe preface all 2.0.X releases as pre-releases... minimize the number of releases between now and 3.0 etc
[14:41] <@wolfsokta> Bruce_ let's discuss further offline.
[14:41] wolfsokta: which are the remaining two pools that need to be upgraded? I've identified qbtc, upbit, and cryptobridge as high volume exchanges that haven't said they were going to do it yet
[14:41] so people can help reach out to them
[14:41] f2pool is notoriously hard to contact
[14:41] are they on board?
[14:42] <@wolfsokta> We could use help reaching out to QBTC and Graviex
[14:42] I can try to contact CB if you want?
[14:42] <@Chatturga> The remaining pools are Ravenminer and PickAxePro.
[14:42] <@Chatturga> I have spoken with their operators, the update just hasnt been applied yet.
[14:42] ravenminer is one of the largest ones too. If they don't upgrade that will be a problem
[14:42] okay good news
[14:42] (PickAxePro sounds like a Ruby book)
[14:43] I strongly feel like getting the word out on ravencoin.org would be beneficial
[14:44] that site is sorely in need of active contribution
[14:44] Anyone can volunteer to contribute
[14:44] <@wolfsokta> Okay, cfox can you talk about the status of unique assets?
[14:44] sure
[14:45] <@wolfsokta> I'll add website to the end of our topics.
[14:45] code is in review and will be on the development branch shortly
[14:45] would it make sense to have a page on the wiki (or somewhere else) that lists the wallet versions run by pools & exchanges?
[14:45] will be in next release
[14:45] furthermore, many sites have friendly link to the standard installers for each platform, if the site linked to the primary installers for each platform to reduce github newb confusion that would be good as well
[14:46] likely to a testnetv5 although that isn't settled
[14:46] <@wolfsokta> Thanks cfox.
[14:46] <@wolfsokta> Are there any questions about unique assets, and how they work?
[14:47] after the # are there any charachters you cant use?
[14:47] will unique assets be constrained by the asset alphanumeric set?
[14:47] ^
[14:47] <@Chatturga> @Urgo there is a page that tracks and shows if they have updated, but it currently doesnt show the actual version that they are on.
[14:47] a-z A-Z 0-9
[14:47] <@Chatturga> https://raven.wiki/wiki/Exchange_notifications#Pools
[14:47] There are a few. Mostly ones that mess with command-line
[14:47] you'll be able to use rpc to do "issueunique MATRIX ['Neo','Tank','Tank Brother']" and it will create three assets for you (MATRIX#Neo, etc.)
[14:47] @cfox - No space
[14:48] @under the unique tags have an expanded set of characters allowed
[14:48] Chatturga: thank you
[14:48] @UJP yes there are some you can't use -- I'll try to post gimmie a sec..
[14:49] Ok. Thank you much!
[14:49] 36^36 assets possible and 62^62 uniques available per asset?
[14:49] <@spyder_> std::regex UNIQUE_TAG_CHARACTERS("^[[email protected]$%&*()[\\]{}<>_.;?\\\\:]+$");
[14:50] regex UNIQUE_TAG_CHARACTERS("^[[email protected]$%&*()[\\]{}<>_.;?\\\\:]+$")
[14:50] oh thanks Mark
[14:51] <@wolfsokta> Okay, next up. I want to thank everybody for helping test the iOS wallet release.
[14:51] <@wolfsokta> We are working with Apple to get the final approval to post it to the App Store
[14:51] @under max asset length is 30, including unique tag
[14:51] Does the RVN wallet have any other cryptos or just RVN?
[14:52] == BruceFenton [[email protected]/web/freenode/ip.67.189.233.170] has joined #ravencoin-dev
[14:52] will the android and ios source be migrated to the ravenproject github?
[14:52] I've been adding beta test users. I've added about 80 new users in the last few days.
[14:52] <@wolfsokta> Just RVN, and we want to focus on adding the asset support to the wallet.
[14:53] == Bruce_ [[email protected]/web/freenode/ip.67.189.233.170] has quit [Ping timeout: 252 seconds]
[14:53] <@wolfsokta> Yes, the code will also be freely available on GitHub for both iOS and Android. Thank you Roshii!
[14:53] Would you consider the iOS wallet to be a more secure place for one's holdings than say, a Mac connected to the internet?
[14:53] will there be a chance of a more user freindly wallet with better graphics like the iOS on PC?
[14:53] the android wallet is getting updated for DGW, correct?
[14:53] <@wolfsokta> That has come up in our discussion Pizza.
[14:54] QT framework is pretty well baked in and is cross platform. if we get some qt gurus possibly
[14:54] Phones are pretty good because the wallet we forked uses the TPM from modern phones.
[14:54] Most important is to write down and safely store your 12 word seed.
[14:54] TPM?
[14:54] <@wolfsokta> A user friendly wallet is one of our main goals.
[14:55] TPM == Trusted Platform Module
[14:55] Ahhh thanks
[14:55] just please no electron apps. they are full of security holes
[14:55] <@spyder_> It is whats makes your stuffs secure
[14:55] not fit for crypto
[14:55] under: depends on who makes it
[14:55] The interface screenshots I've seen look like Bread/Loaf wallet ... I assume that's what was forked from
[14:55] ;)
[14:56] <@wolfsokta> @roshii did you see the question about the Android wallet and DGW?
[14:56] Yes, it was a fork of breadwallet. We like their security.
[14:56] chromium 58 is the last bundled electron engine and has every vuln documented online by google. so unless you patch every vuln.... methinks not
[14:56] Agreed, great choice
[14:57] <@wolfsokta> @Under, what was your proposal?
[14:58] All asset creation Transactions have a mandatory OP_CHECKLOCKTIMEVERIFY of 1 year(or some agreed upon time interval), and the 500 RVN goes to a multisig devfund, run by a custodial group. We get: 1) an artificial temporary burn, 2) sustainable community and core development funding for the long term, after OSTK/Medici 3) and the reintroduction of RVN supply at a fixed schedule, enabling the removal of the 42k max cap of total As
[14:58] *im wrong on the 42k figure
[14:58] <@wolfsokta> Interesting...
[14:59] <@wolfsokta> Love to hear others thoughts.
[14:59] Update: I posted a message on the CryptoBridge discord and one of their support members @stepollo#6276 said he believes the coin team is already aware of the fork but he would forward the message about the fork over to them right now anyway
[14:59] Ifs 42 million assets
[14:59] yep.
[15:00] I have a different Idea. If the 500 RVN goes to a dev fund its more centralized. The 500 RVN should go back into the unmined coins so miners can stay for longer.
[15:01] *without a hardfork
[15:01] <@wolfsokta> lol
[15:01] that breaks halving schedule, since utxos cant return to an unmined state.
[15:01] @UJP back into coinbase is interesting. would have to think about how that effects distribution schedule, etc.
[15:01] only way to do that would be to dynamicaly grow max supply
[15:02] and i am concerned already about the max safe integer on various platforms at 21 billion
[15:02] js chokes on ravencoin already
[15:02] <@wolfsokta> Other thoughts on Under's proposal? JS isn't a real language. ;)
[15:02] Well Bitcoin has more than 21 bn Sats
[15:02] Is there somebody who wants to volunteer to fix js.
[15:02] hahaha
[15:03] I honestly would hate for the coins to go to a dev fund. It doesn't seem like Ravencoin to me.
[15:03] Yep, but we're 21 billion x 100,000,000 -- Fits fine in a 64-bit integer, but problematic for some languages.
[15:03] <@wolfsokta> Thanks UJP
[15:04] <@wolfsokta> We're past time but I would like to continue if you folks are up for it.
[15:04] Yeah no coins can go anywhere centrality contorted like a dev fund cause that would mean someone has to run it and the code can’t decide that so it’s destined to break
[15:05] currently and long term with out the financial backing of development then improvements and features will be difficult. we are certainly thankful for our current development model. but if a skunkworks project hits a particular baseline of profitability any reasonable company would terminate it
[15:05] Yes let’s contibue for sure
[15:05] the alternative to a dev fund in my mind would be timelocking those funds back to the issuers change address
[15:06] But we can’t have dev built in to the code — it has to be open source like Bitcoin and monero and Litecoin - it’s got drawbacks but way more advantages- it’s the best model
[15:06] Dev funding
[15:06] i highly reccommend not reducing the utility of raven by removing permanently the supply
[15:07] == BW_ [[email protected]/web/freenode/ip.138.68.243.202] has joined #ravencoin-dev
[15:07] timelocking those funds accompllishes the same sacrifice
[15:07] @under timelocking is interesting too
[15:07] How exactly does timelocking work?
[15:07] <@wolfsokta> ^
[15:07] I mean you could change the price of assets with the Block reward halfing.
[15:07] == Roshiix [[email protected]/web/freenode/ip.105.67.2.212] has joined #ravencoin-dev
[15:08] funds cant be spent from an address until a certain time passes
[15:08] but in a what magical fairy land do people continue to work for free forever. funding development is a real issue... as much as some might philosphically disagree. its a reality
[15:08] You’d still need a centralized party to decide how to distribute the funds
[15:08] even unofficially blockstream supports bitcoin devs
[15:08] on chain is more transparent imho
[15:09] == Tron_ [[email protected]/web/freenode/ip.173.241.144.77] has joined #ravencoin-dev
[15:09] @UJP yes there are unlimited strategies. one factor that I think is v important is giving application developers a way to easily budget for projects which leads to flat fees
[15:09] If the project is a success like many of believe it will be, I believe plenty of people will gladly done to a dev fund. I don't think the 500 should be burned.
[15:09] *donate
[15:09] centralized conservatorship, directed by community voting process
[15:10] == Tron [[email protected]/web/freenode/ip.173.241.144.77] has quit [Ping timeout: 252 seconds]
[15:10] <@wolfsokta> Thanks Under, that's an interesting idea that we should continue to discuss in the community. You also mentioned the existing website.
[15:10] It would need to be something where everyone with a QT has a vote
[15:10] think his computer went to sleep again :-/
[15:10] I agree UJP
[15:10] with the website
[15:10] No that’s ico jargon — any development fund tied to code would have to be centralized and would therefor fail
[15:11] ^
[15:11] ^
[15:11] ^
[15:11] dashes model for funding seems to be pretty decentralized
[15:11] community voting etc
[15:11] Once you have a dev fund tied to code then who gets to run it? Who mediates disputes?
[15:11] oh well another discussion
[15:11] Dash has a CEO
[15:12] <@wolfsokta> Yeah, let's keep discussing in the community spaces.
[15:12] Dash does have a good model. It's in my top ten.
[15:12] having the burn go to a dev fund is absolute garbage
[15:12] These dev chats should be more target than broad general discussions — changing the entire nature of the coin and it’s economics is best discussed in the RIPs or other means
[15:13] <@wolfsokta> Yup, let's move on.
[15:13] just becuase existing implementation are garbage doesnt mean that all possible future governance options are garbage
[15:13] <@wolfsokta> To discussing the website scenario mentioned by under.
[15:13] the website needs work. would be best if it could be migrated to github as well.
[15:13] What about this: Anyone can issue a vote once the voting feature has been added, for a cost. The vote would be what the coins could be used for.
[15:14] features for the site that need work are more user friendly links to binaries
[15:14] <@wolfsokta> We investigated how bitcoin has their website in Github to make it easy for contributors to jump in.
[15:14] that means active maintenance of the site instead of its current static nature
[15:15] <@wolfsokta> I really like how it's static html, which makes it super simple to host/make changes.
[15:15] the static nature isn’t due to interface it’s due to no contributors
[15:15] no contribution mechanism has been offered
[15:15] github hosted would allow that
[15:16] We used to run the Bitcoin website from the foundation & the GitHub integration seemed to cause some issues
[15:16] its doesnt necessarily have to be hosted by github but the page source should be on github and contributions could easily be managed and tracked
[15:17] for example when a new release is dropped, the ability for the downlaods section to have platform specific easy links to the general installers is far better for general adoption than pointing users to github releases
[15:18] <@wolfsokta> How do people currently contribute to the existing website?
[15:18] they dont?
[15:18] We did that and it was a complete pain to host and keep working — if someone wants to volunteer to do that work hey can surely make the website better and continually updated — but they could do that in Wordpress also
[15:19] I’d say keep an eye out for volunteers and maybe we can get a group together who can improve the site
[15:19] == digitalvap0r-xmr [[email protected]/web/cgi-irc/kiwiirc.com/ip.67.255.25.134] has joined #ravencoin-dev
[15:19] And they can decide best method
[15:20] I host the source for the explorer on github and anyone can spin it up instantly on a basic aws node. changes can be made to interface etc, and allow for multilingual translations which have been offered by some community members
[15:20] there are models that work. just saying it should be looked at
[15:20] i gotta run thank you all for your contributions
[15:20] <@wolfsokta> I feel we should explore the source for the website being hosted in GitHub and discuss in our next dev meeting.
[15:21] <@Chatturga> Thanks Under!
[15:21] == under [[email protected]/web/freenode/ip.72.200.168.56] has quit [Quit: Page closed]
[15:21] <@wolfsokta> Thanks, we also need to drop soon.
[15:21] There is no official site so why care. Someone will do better than the next if RVN is worth it anyway. That's already the case.
[15:21] <@wolfsokta> Let's do 10 mins of open Q&A
[15:22] <@wolfsokta> Go...
[15:23] <@Chatturga> Beuller?
[15:24] No questions ... just a comment that the devs and community are great and I'm happy to be a part of it
[15:24] I think everyone moved to discord. I'll throw this out there. How confident is the dev team that things will be ready for oct 31st?
[15:24] <@wolfsokta> Alright! Thanks everybody for joining us today. Let's plan to get back together as a dev group in a couple of weeks.
[15:25] thanks block!
[15:25] <@wolfsokta> Urgo, very confident
[15:25] Please exclude trolls from discord who havent read the whitepaper
[15:25] great :)
[15:25] "things" will be ready..
[15:25] Next time on discord right?
[15:25] woah why discord?
[15:25] some of the suggestions here are horrid
[15:25] this is better less point
[15:25] == blondfrogs [[email protected]/web/freenode/ip.185.245.87.219] has quit [Quit: Page closed]
[15:25] Assets are working well on testnet. Plan is to get as much as we can safely test by Sept 30 -- this includes dev contributions. Oct will be heavy testing and making sure it is safe.
[15:26] people
[15:26] <@wolfsokta> Planning on same time, same IRC channel.
[15:26] == BW_ [[email protected]/web/freenode/ip.138.68.243.202] has quit [Quit: Page closed]
[15:26] @xmr any in particular?
[15:27] (or is "here" discord?)
[15:27] Cheers - Tron
[15:27] "Cheers - Tron" - Tron
submitted by Chatturga to Ravencoin


All miners must at any time agree on the current difficulty target for a nonce to count as valid (in recent Bitcoin Core versions the retargeting logic lives around src/pow.cpp). New blocks are only added to the block chain if their hash is at least as challenging as the difficulty value expected by the consensus protocol. Every 2,016 blocks the network uses the timestamps stored in those blocks to measure how long they actually took to produce. Under the original difficulty adjustment rule, each Bitcoin client compares the actual time taken to generate 2,016 blocks with the two-week goal and scales the target by the percentage difference. A single retarget never moves the target down (increasing difficulty) or up (decreasing difficulty) by more than a factor of 4. The target itself is a 256-bit number shared by all Bitcoin clients: the SHA-256 hash of a block's header must be lower than or equal to the current target for the block to be accepted, and the lower the target, the harder it is to generate a block. Block generation is not a fixed amount of work but more like a lottery. Difficulty is simply a measure of how hard it is to find a hash below the given target; the Bitcoin network has one global block difficulty, and mining pools additionally set a pool-specific share difficulty as a lower bound for submitted shares. The network difficulty changes every 2,016 blocks. When using a mining profitability calculator, it is best to enter your own assumption for the average Bitcoin price over the coming period, since recent price trends weigh heavily on the result.
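To make the retargeting rule concrete, here is a small sketch of the calculation in Python (a simplified model of the rule described above, not Bitcoin Core's implementation; it ignores the compact "bits" encoding the real client uses):

TARGET_TIMESPAN = 14 * 24 * 60 * 60      # the two-week goal, in seconds
RETARGET_INTERVAL = 2016                 # blocks between adjustments
MAX_TARGET = 0xFFFF * 2 ** 208           # the easiest (difficulty-1) target

def next_target(old_target: int, actual_timespan: int) -> int:
    """Scale the target by the ratio of the time the last 2016 blocks actually took
    to the two-week goal, clamped so one retarget never changes it by more than 4x."""
    clamped = min(max(actual_timespan, TARGET_TIMESPAN // 4), TARGET_TIMESPAN * 4)
    new_target = old_target * clamped // TARGET_TIMESPAN
    return min(new_target, MAX_TARGET)   # never easier than the minimum difficulty

def difficulty(target: int) -> float:
    """Difficulty expresses how much harder the current target is than MAX_TARGET."""
    return MAX_TARGET / target

# Example: if the last 2016 blocks took only one week, the target halves,
# so the difficulty roughly doubles.
old = MAX_TARGET // 1000
new = next_target(old, TARGET_TIMESPAN // 2)
print(round(difficulty(old)), "->", round(difficulty(new)))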



