Week 0:
Task: Migrate API from GAE to AWS Elastic Beanstalk
Day 1: Tried running the API on the local machine but could not, because pylibmc is incompatible with Windows, so I tried deploying it directly to AWS EB.
Checked app.yaml from GAE for the Python version and environment variables. While creating the application on EB, I used the version specified in the YAML file (2.7) and saved the environment variables in a .env file. Tried deploying, but pylibmc caused multiple errors. Searching Stack Overflow and checking the logs showed that gcc and libmemcached had to be installed as well. Changed .elasticbeanstalk/config.yml to install gcc and libmemcached.
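The package installation can be sketched as an EB config fragment (an assumption on my part: on Elastic Beanstalk, yum packages are usually declared in a `.ebextensions/*.config` file, and the filename and package names below are illustrative for Amazon Linux):

```yaml
# .ebextensions/01-packages.config (hypothetical filename)
# Install the compiler and memcached client headers that pylibmc
# needs in order to build during deployment.
packages:
  yum:
    gcc: []
    libmemcached-devel: []
```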
Day 2: Changed pylibmc to the latest version and deployed; the deployment was successful. But EB's WSGI setup still expects the Flask app to be named application, so I renamed it and deployed again. Deployment successful.
Used this SO answer to create a config file.
Deployment successful! Works fine! There was a MemoryError on the first run (traceback), but it worked fine on the next run.
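For reference, the WSGI config from that kind of SO answer usually looks something like this (a sketch for the Amazon Linux Python platform of that era; the filename is an assumption):

```yaml
# .ebextensions/02-wsgi.config (hypothetical filename)
# Point EB's WSGI server at the renamed entry point.
option_settings:
  aws:elasticbeanstalk:container:python:
    WSGIPath: application.py
```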
Changes Made:
1. Installed the latest version of pylibmc (edited requirements.txt).
2. Renamed main.py to application.py and fixed references wherever it was used.
3. Renamed the Flask application object from app to application. Both of the above were done for compatibility with EB's WSGI settings.
4. Used this SO answer to add the WSGI configuration.
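Changes 2 and 3 boil down to this shape (a minimal sketch, assuming a plain Flask app; the route is illustrative):

```python
# application.py -- EB's default Python WSGI config looks for a module
# named application that exposes a callable named application.
from flask import Flask

# Renamed from `app` so Elastic Beanstalk's WSGI server can find it.
application = Flask(__name__)

@application.route("/")
def index():
    # Illustrative route; the real API's endpoints stay unchanged.
    return "OK"

if __name__ == "__main__":
    application.run()
```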
The earlier issue with pylibmc was due to an incompatibility with the gcc version being used. Searching the web, I found that pylibmc requires the same gcc version it was built for. Since I could not find out which gcc version the older release of the library was built for, I installed the latest version of both, and that worked. And since the code mainly used only the library's get and set functions, the upgrade caused no compatibility issues in the code.
Duplicating data on Firebase
Another task was to duplicate Realtime Database data from a Firebase account to my own account for a demo.
- One option: export the Realtime Database of the source account, then import the JSON into the Realtime Database of the destination account. You cannot import a file larger than 256 MB using this method.
- Another option is to first take a backup of the source account's Realtime Database in a storage bucket, then download it on a cloud VM (configure permissions) and use a streaming import tool like this Python utility or this nodejs package.
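A streaming import of that kind can be sketched in Python like this (hypothetical: chunk_export and the upload loop are my own illustration, not either of the tools mentioned, and the destination URL is a placeholder):

```python
# Stream a large Realtime Database JSON export into a destination project
# in small PATCH requests, avoiding the 256 MB console import limit.
import itertools

def chunk_export(data: dict, chunk_size: int = 100):
    """Split a top-level export dict into batches of chunk_size keys."""
    it = iter(data.items())
    while True:
        batch = dict(itertools.islice(it, chunk_size))
        if not batch:
            break
        yield batch

# Upload loop (needs network access and database credentials):
# import json, requests
# DEST_URL = "https://<your-project>.firebaseio.com/.json"  # placeholder
# with open("export.json") as f:
#     data = json.load(f)
# for batch in chunk_export(data):
#     requests.patch(DEST_URL, json=batch).raise_for_status()
```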
Week 1
Week 2
Day 4: Looked at the use cases of Reddit Vault, augur.net, and numer.ai. Also looked at services like Fortmatic for creating Ethereum wallets.
Day 5: Looked at Swift libraries that can be used to create and work with Ethereum wallets, including web3.swift, EthereumKit, and EtherWalletKit. After a feature comparison, web3.swift was best suited for our work.
Week 3
Task: Create an Ethereum smart contract for staking custom tokens
Day 1: Created a simple Ethereum smart contract for Gulpie’s use case of staking. Dapp University’s videos and Solidity’s documentation helped a lot.
Day 2: Improved the previous smart contract with cleaner functions and a better approach to staking.
Day 3: Searched for Python libraries for interacting with the smart contract from the API. Looked at web3.py and Brownie; Brownie seemed easier to work with, so I started using it. Dapp University’s videos on web3.py again helped me understand what was going on. Ran into trouble because the custom ERC20 tokens weren’t being transferred between accounts.
Day 4: Integrated the smart contract with the API using the Brownie Python library.
Day 5: Changed the reward function of the staking contract. My method was to reward all the stakeholders whenever any one of them claimed their rewards. This was inspired by this SE answer; though it did not recommend exactly what I was doing, it was similar.
This approach was changed after my mentor asked me to consider the user side as well, where it would be difficult for the user to understand what exactly is happening behind the scenes. Also looked at how Augur does this: they reward each stakeholder only when that stakeholder claims their rewards. Implemented the same method in the smart contract.
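The final scheme can be modeled in plain Python as follows (a sketch, not the actual Solidity contract; the names, integer token units, and 10% rate are assumptions):

```python
# Model of reward-on-claim staking: a stakeholder's reward is computed
# and credited only when that stakeholder claims, instead of rewarding
# everyone whenever any single claim happens.

class StakingModel:
    REWARD_RATE_PCT = 10  # assumed flat reward: 10% of stake, integer units

    def __init__(self):
        self.stakes = {}    # address -> staked amount
        self.balances = {}  # address -> credited rewards

    def stake(self, addr: str, amount: int) -> None:
        self.stakes[addr] = self.stakes.get(addr, 0) + amount

    def claim(self, addr: str) -> int:
        """Reward only the caller; other stakeholders are untouched."""
        reward = self.stakes.get(addr, 0) * self.REWARD_RATE_PCT // 100
        self.balances[addr] = self.balances.get(addr, 0) + reward
        return reward
```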