About the company
Persona is a blockchain-based customer intelligence platform specializing in Dapp analytics using AI modeling on on-chain and off-chain data.
In today’s world, Dapps do not understand their customers well, leading to poor customer experiences and low retention. Blockchain data is also difficult to access, and the anonymity of wallets makes interpreting customer behavior even harder.
We are on a mission to help businesses understand their users' behavior so they can improve outreach and engagement. We are backed and trusted by top-tier VCs, and we are looking for highly resourceful individuals who understand the needs of our customers and are passionate about building world-class systems that challenge the status quo.
About the job
In this role, you will be part of the founding team and build state-of-the-art systems to collect and aggregate on-chain and off-chain data consumed by AI models to derive useful customer insights. Compensation will be cash + equity.
You will work directly with the co-founding team (ex-Amazon, LinkedIn, Wharton, Berkeley, IIT), who have experience building and scaling large-scale data and AI systems at some of the most customer-centric companies in the world. Together, we aspire to bring the same customer-centricity to web3 and help onboard the next billion users!
- Design, develop, and support data pipelines that build wallet, NFT, and dapp data tables
- Develop SQL/GraphQL queries to draw insights from the data
- Build wallet segments and NFT/dapp similarity models using AI
- Build production-grade APIs on top of the in-house data tables
- Collaborate with and influence stakeholders and support engineers to ensure our data infrastructure meets constantly evolving requirements
- Work closely with product engineers to create, test, and maintain data models
- Contribute to engineering efforts end to end, from planning and organization through execution and delivery, including solving complex engineering problems
- Take initiative and be responsible for technical solutions to data quality and workflow challenges
- Write and review technical documents, including design, development, and revision documents
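As a flavor of the pipeline work described above, here is a minimal sketch of aggregating raw on-chain events into a wallet-level data table. The event fields, example wallets, and dapp names are all hypothetical; a real pipeline would ingest events from a node or indexer and write the result to a warehouse table.

```python
from collections import defaultdict

# Hypothetical raw on-chain transfer events (wallet, dapp, value in ETH).
events = [
    {"wallet": "0xabc", "dapp": "uniswap", "value": 1.5},
    {"wallet": "0xabc", "dapp": "opensea", "value": 0.3},
    {"wallet": "0xdef", "dapp": "uniswap", "value": 2.0},
]

def build_wallet_table(events):
    """Aggregate raw events into one row per wallet:
    transaction count, number of unique dapps touched, and total value."""
    acc = defaultdict(lambda: {"tx_count": 0, "dapps": set(), "total_value": 0.0})
    for e in events:
        row = acc[e["wallet"]]
        row["tx_count"] += 1
        row["dapps"].add(e["dapp"])
        row["total_value"] += e["value"]
    # Flatten the accumulator into plain rows.
    return {
        wallet: {
            "tx_count": row["tx_count"],
            "unique_dapps": len(row["dapps"]),
            "total_value": row["total_value"],
        }
        for wallet, row in acc.items()
    }

wallet_table = build_wallet_table(events)
```

In production this aggregation would typically run as a scheduled job (e.g. an Airflow task) over batched or streamed event data rather than an in-memory list.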
Are you the right person for this role?
- Experience building data pipelines and AI models
- Advanced knowledge of modern data pipeline architecture and the AWS/GCP ecosystem
- Experience in performing root cause analysis on data logging and ingestion processes, identifying opportunities to improve instrumentation and observability
- Experience designing data models and data warehouses
- Experience with APIs
- Experience with GraphQL, Airflow
- Self-motivated and passionate about improving system efficiency through optimization
- Able to set the standard for other engineers by proposing and driving innovative ideas
- Experience working with web3 projects and dapps
- Understanding of the web3 ecosystem: smart contracts, NFTs, and tokens
- Experience working with Kafka
- Experience with data visualization tools
* If interested, send your resume to firstname.lastname@example.org