What would be an easy way to simulate a double spend on a locally running ethereum blockchain? Could ganache help with this? Could someone please point in the right direction? We are trying to build a monitoring tool to detect double spend but don't know how to create a double spend in the first place.
The idea of a double spend is quite simple: get two or more nodes into a private PoW network, sync them, and start mining. Use one node to spend your ether, then stop that node from broadcasting to the network and rewind it to a block before the spend. Resume mining, but give that node more hashing power so it mines faster than the other; its chain will become the longest. Now resume broadcasting, and the rest of the network will reorganize onto the attacker's chain, dropping the original spend. This will be challenging, but you can use the management APIs provided with Geth (https://github.com/ethereum/go-ethereum/wiki/Management-APIs) to control peers, mining, and the chain state.
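A rough sketch of that sequence on the Geth JavaScript console (node A is the attacker, node B the honest peer). This assumes an older Geth release that still exposes the `miner` and `debug.setHead` APIs, running on a private PoW network; the enode URL, password, addresses and block numbers are placeholders:

```js
// Geth JavaScript console on node A (the attacker); node B is the honest peer.
// enode URL, password, addresses and block numbers below are placeholders.

// 1. While both nodes are mining and in sync, spend the ether.
personal.unlockAccount(eth.accounts[0], "password")
var tx = eth.sendTransaction({from: eth.accounts[0], to: "0xVictimAddress", value: web3.toWei(1, "ether")})
// ...wait until it is mined, then note its block number:
var spendBlock = eth.getTransactionReceipt(tx).blockNumber

// 2. Stop broadcasting: disconnect node A from node B and pause mining.
admin.removePeer("enode://<node-B-enode>@127.0.0.1:30304")
miner.stop()

// 3. Rewind node A to a block before the spend, then mine the private fork
//    with more threads than node B so it grows faster.
debug.setHead(web3.toHex(spendBlock - 1))
miner.start(4)

// 4. When node A's fork is longer than the public chain, reconnect.
//    Node B reorganizes onto the longer chain and the original spend is dropped.
admin.addPeer("enode://<node-B-enode>@127.0.0.1:30304")
```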
Related
Gas costs for even a simple contract deployment are astronomical. If I spin up my own node and connect directly to it, can I deploy on that node and avoid the gas fees?
Can I avoid gas fees by running my own Ethereum node?
Simple answer: No.
A node accepts the transaction, stores it in its mempool, and relays it to other nodes so that they can also store it in their mempool. There's nothing related to gas costs at this point.
Theoretically you could deploy a contract for "free" by mining the block yourself. But on mainnet that's practically impossible without professional mining hardware costing far more than what you'd save on the gas fees.
However, in a block that you mined, you could include a transaction deploying a contract with 0 gas price. This sometimes happens - miners do occasionally put their own 0-priced transactions in their own blocks. But apart from that, no miner would accept your transaction with 0 gas price (or any transaction priced lower than the current market rate), as that wouldn't be profitable for them.
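For illustration, here's a minimal sketch of that "free" case with web3.js against a local development chain you control (e.g. ganache-cli or `geth --dev`); the endpoint and bytecode are placeholders, and depending on the node's txpool settings it may still reject a 0 gas price transaction:

```js
// Sketch: deploy a contract "for free" on a chain where you are the miner,
// so nobody has to find a 0 gas price transaction profitable.
const Web3 = require('web3');
const web3 = new Web3('http://127.0.0.1:8545');   // local dev chain, placeholder endpoint

async function deployForFree(bytecode) {          // bytecode: any compiled contract
  const [from] = await web3.eth.getAccounts();
  const receipt = await web3.eth.sendTransaction({
    from,
    data: bytecode,   // no `to` field => contract creation
    gas: 3000000,
    gasPrice: 0       // accepted here only because it's your own chain
  });
  console.log('deployed at', receipt.contractAddress, 'gas used:', receipt.gasUsed);
}
```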
Compared with a traditional program, how much more does it cost to run an equivalent EVM smart contract?
I'm looking for a rough estimate like: the same program would cost 100x more as an EVM smart contract than on a traditional server.
Well, when we talk about smart contracts you only pay gas. So if you want to know how much it will cost you, you need to know the gas price on your target chain, the gas needed to deploy the contract, and the gas of the function calls needed to set it up. That's it; there is no ongoing hosting bill, just the gas you spend.
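As a rough sketch of how you'd put numbers on that with web3.js (the RPC endpoint, bytecode, and ETH/USD rate are placeholders you'd supply yourself):

```js
// Rough cost = gas used * gas price * ETH/USD rate; there is no other bill.
const Web3 = require('web3');
const web3 = new Web3('https://your-rpc-endpoint'); // placeholder

async function estimateDeployCostUsd(bytecode, ethUsd) {
  const gas = await web3.eth.estimateGas({ data: bytecode }); // gas to deploy
  const gasPriceWei = await web3.eth.getGasPrice();           // current network gas price (wei)
  const costWei = BigInt(gas) * BigInt(gasPriceWei);
  const costEth = Number(web3.utils.fromWei(costWei.toString(), 'ether'));
  return { gas, costEth, costUsd: costEth * ethUsd };
}

// e.g. estimateDeployCostUsd('0x6080604052...', 1800).then(console.log);
```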
I'm interested in the conceptual topic of creating rights management systems on the Ethereum blockchain with digital assets represented by an NFT.
I am just reading up on how to write programs that run on Ethereum, but I have some very basic questions just to get started.
I read that NFTs are created on the Ethereum blockchain. I don't really understand whether that is the same blockchain on which the currency Ether is maintained? It seems like the ledger will become impossibly large if every currency transaction and every digital asset (and copy thereof) that migrates to Ethereum is stored in one single giant ledger, and each miner on the chain has to download the entire ledger onto a single machine in order to validate transactions. Have I got a big misunderstanding there? I know there is talk about "sharding" in the future, but it seems like that isn't coming very soon.
Cost of running a smart contract on the blockchain? Assuming that we are talking about the same blockchain, from what I can see the price of "gas" is quite high. I'm reading that an ETH transfer from one party to another costs 21,000 gas, about $0.03 today. Just trying to understand the basics: how much does it cost to create an NFT? And roughly how much does it cost to execute a simple function on the blockchain (without loops)? Let's say the equivalent of a 5-statement function which takes a few simple params, reads a few blocks, doesn't write to the blockchain, but just performs some simple math and a few if statements and returns a string. Does that also cost, like, more than a penny? Is the ETH2 switch from proof of work to proof of stake going to bring those costs down by orders of magnitude?
Any good resources or references on how to write programs which create and manipulate NFTs on Ethereum? Most of what I have seen in the bookstores seems to cover financial transactions with Ether.
Yes, it's the same blockchain.
You can see in the stats that a full node (which stores the current state) currently takes about 400 GB, and an archive node (which stores historical states as well) takes about 6.6 TB.
My observation is that most web apps using blockchain data don't verify it themselves and instead trust a third-party service running a node (such as Infura). And I believe that most end users or businesses who want/need to verify usually have the capacity to store 400+ GB and are able to scale.
But whether this amount of data is okay or "impossibly large", I'll leave to your judgment. :)
Deployment of a token smart contract usually costs between 500k and 3M gas. My estimate is that most token contracts with basic features, compiled with the optimizer, cost around 1M gas to deploy. With current prices of ~200 Gwei/gas and $1800/ETH, that's about $350. But just a few months ago the average gas price was ~20 Gwei and ETH cost $500, so the same deployment would have been around $10. So yes, the cost of deploying a contract is very volatile.
A simple function that performs validations and transformations in memory is going to cost the base 21k plus a few hundred gas. (Working with memory data is cheap gas-wise; accessing storage is much more expensive.) So at current prices that's around $7; a few months ago it could have been $0.25.
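For reference, the arithmetic behind those figures, as a quick sketch in plain JavaScript using the ~200 Gwei/gas and $1800/ETH numbers above:

```js
// cost in USD = gas * gasPrice (Gwei) * 1e-9 (ETH per Gwei) * ETH/USD
const costUsd = (gas, gasPriceGwei, ethUsd) => gas * gasPriceGwei * 1e-9 * ethUsd;

console.log(costUsd(1000000, 200, 1800)); // ~1M gas deployment today: ~$360
console.log(costUsd(21500, 200, 1800));   // 21k base + a few hundred gas: ~$7.7
console.log(costUsd(1000000, 20, 500));   // same deployment a few months ago: ~$10
```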
As for the question whether ETH2.0 is going to bring lower gas prices: my opinion is that L2 (which should be released earlier than PoS) is going to have some effect on the price, since it allows for sidechain transactions (similar to the Lightning Network on Bitcoin). But this is a development forum, so I'm not going to dive deeper into price speculation.
I recommend the OpenZeppelin docs, where they cover their open-source implementations of the ERC standards (including ERC-721 NFTs), or googling the topic you're interested in and reading articles that catch your eye (at least that's my current approach).
And if you're new to Solidity in general, I recommend at least a few chapters of the CryptoZombies tutorial. In my opinion the first few chapters are great and you'll learn a lot, but then the quality slowly fades.
I was wondering how the Ethereum blockchain works compared with the Bitcoin blockchain.
I know that, in Bitcoin, all mining nodes compete to mine blocks (putting public transactions into them and thus earning bitcoin as transaction-processing fees), and that all nodes compete for the next block at the same time with an equal chance of mining it.
But in Ethereum, where you want a network of distributed apps that get executed according to the gas price they are willing to pay (and the starting gas), are all nodes competing for the next block at any given time? Wouldn't this be a waste of computation?
Yes, all the nodes do compete for (pretty much) the same blocks, and yes - they do execute all the code in a block, even if this block is not going to be successfully mined.
Don't think of it as "waste," but rather as a mechanism to ensure proof of work.
In short, yes, there is a lot of wasted computation.
Ethereum's mining process is almost the same as bitcoin’s.
For each block of transactions, miners repeatedly run the block's unique header metadata (including the timestamp and software version) together with a changing nonce through a hash function. If a miner finds a hash that meets the current target (i.e., falls below it), they are awarded ether and broadcast the block across the network for every node to validate and add to its own copy of the ledger. If miner B finds such a hash first, miner A stops work on the current block and repeats the process for the next block.
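For illustration only, here is a toy version of that search loop in JavaScript. It uses SHA-256 instead of Ethereum's real Ethash algorithm and a made-up header and target, but the structure (hash the header plus a changing nonce, compare against the target, repeat) is the same:

```js
// Toy proof-of-work loop (NOT Ethash): find a nonce so that
// hash(header + nonce) is numerically below the current target.
const crypto = require('crypto');

function mine(headerHex, targetHex) {
  const target = BigInt('0x' + targetHex);
  for (let nonce = 0n; ; nonce++) {
    const hash = crypto.createHash('sha256')
      .update(headerHex + nonce.toString(16))
      .digest('hex');
    if (BigInt('0x' + hash) < target) {
      return { nonce: nonce.toString(), hash }; // "block found" -> broadcast it
    }
  }
}

// A deliberately easy target so the example finishes quickly:
console.log(mine('deadbeef', '0000ffff'.padEnd(64, 'f')));
```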
I'm developing a dapp and got it working well using web3 and testrpc.
My frontend is currently pretty "chatty" with contract calls (constant methods) and everything works super fast.
I was wondering what kind of latency I should expect on the real network for simple calls? Do I need to aggressively optimize my contract reads?
It depends. If your dApp is talking to your own node (and that node is fully synced), then constant functions will execute about as fast as what you're seeing in your testing. If not, then all bets are off: your latency will depend on the provider you're connecting to.
My best advice is, once you finish development, deploy to a testnet and run performance tests. Chances are, if you're not running a fully synced local node and your app is as chatty as you say, you may be disappointed with the results. You would want to look into optimizing your reads, moving some state data out of the contract (if possible), or turning your client into a light node.
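One concrete way to make a chatty frontend less chatty is to batch the constant calls into a single request. A minimal sketch with web3.js 1.x's BatchRequest (the ABI, contract address, and method names are placeholders for your own contract):

```js
// Batch several constant reads into one round trip instead of many separate calls.
const Web3 = require('web3');
const web3 = new Web3('http://127.0.0.1:8545');     // your node or provider
const abi = [ /* your contract's ABI */ ];
const contractAddress = '0x...';                     // your deployed contract
const contract = new web3.eth.Contract(abi, contractAddress);

const batch = new web3.BatchRequest();
batch.add(contract.methods.name().call.request({}, (err, name) => console.log('name:', name)));
batch.add(contract.methods.totalSupply().call.request({}, (err, supply) => console.log('supply:', supply)));
batch.execute();
```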