Open sourcing and decentralizing TezID Profiles

At the moment the project doesn’t look complete or usable.
Are you still working on this project?

1 Like

Hey! I have been dealing with some life events that have prevented me from working on it lately, but I most definitely am still working on it and plan to complete it as soon as possible.

The latest news is that I have been working on tooling for downloading about 2 GB of profile pictures, which I plan to use to bootstrap the contract on mainnet.

2 Likes

You recommended TzCommunity. Is this the correct link?
https://tezos-community.com/organizations

The frontend is no longer hosted, as Marigold has been defunded.

You can run it yourself: GitHub - marigold-dev/tezos-community (a project to build a dapp for the Tezos community that includes a DAO, organization rules, multisig, message broadcast, etc.)

So that application was not decentralized, and it could be a mistake to use it :frowning:
This is the reason to have decentralized on-chain profiles.

1 Like

The smart contract is still running on-chain. It is just that we don’t pay for the frontend anymore.

So it is still decentralized.

This is why we need TCP hosted by many entities :blush::pray:

Hey, so a little update for everyone :blush:

I have finished updating the contracts to LIGO 1.6.0 and deployed them on Ghostnet.

I have also added tooling to download profiles from TzProfiles and TezID. I have scraped a total of 43,454 profiles that I intend to bootstrap the contract with.
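
For the curious, pulling profile data out of an existing contract can be done through a public indexer. Here is a minimal sketch using the TzKT big map keys API; the contract address and big map name are placeholders, and the actual tooling may work differently:

```typescript
// Minimal sketch: page through a contract's big map keys via the TzKT API.
// PROFILES_CONTRACT and the big map name are placeholders, not real values.
const TZKT = "https://api.tzkt.io/v1";
const PROFILES_CONTRACT = "KT1..."; // hypothetical address

async function fetchAllKeys(bigmapName: string): Promise<any[]> {
  const keys: any[] = [];
  const limit = 1000;
  for (let offset = 0; ; offset += limit) {
    const res = await fetch(
      `${TZKT}/contracts/${PROFILES_CONTRACT}/bigmaps/${bigmapName}/keys` +
        `?active=true&limit=${limit}&offset=${offset}`
    );
    const page: any[] = await res.json();
    keys.push(...page);
    if (page.length < limit) return keys; // last page reached
  }
}

// Usage: const profiles = await fetchAllKeys("profiles");
```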

I have also added tooling to batch bootstrap the contract.
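
To give an idea of what batch bootstrapping can look like, here is a sketch using Taquito’s batch API. The RPC endpoint, secret key, contract address, and the set_profile entrypoint are all placeholders, not the actual contract interface:

```typescript
import { TezosToolkit } from "@taquito/taquito";
import { InMemorySigner } from "@taquito/signer";

// Sketch only: the secret key, contract address, and `set_profile`
// entrypoint are hypothetical placeholders.
const Tezos = new TezosToolkit("https://ghostnet.ecadinfra.com");

async function bootstrap(profiles: { owner: string; payload: string }[]) {
  Tezos.setProvider({ signer: await InMemorySigner.fromSecretKey("edsk...") });
  const contract = await Tezos.contract.at("KT1...");

  // Send the profiles in chunks so each batch stays within operation limits.
  const CHUNK = 100;
  for (let i = 0; i < profiles.length; i += CHUNK) {
    let batch = Tezos.contract.batch();
    for (const p of profiles.slice(i, i + CHUNK)) {
      batch = batch.withContractCall(contract.methods.set_profile(p.owner, p.payload));
    }
    const op = await batch.send();
    await op.confirmation(); // wait before sending the next chunk
  }
}
```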

A new consideration: I was originally intending to have the ‘’ (empty) key point to an ‘ipfs://’ URI for the payloads. But the profile payloads are not that large; they only contain:

{ nic, pic, bio }

(other data like proofs, web2 links, etc. will go in other big_map keys, so the main ‘’ key will not grow; see the payload sketch below)
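
To make the layout concrete, here is a sketch of how a client could build the bytes for the ‘’ key. char2Bytes is Taquito’s hex-encoding helper; the field values, and the idea that the pictures themselves would still be referenced by URI, are assumptions:

```typescript
import { char2Bytes } from "@taquito/utils";

// The basic profile payload stored under the '' (empty) big_map key.
interface Profile {
  nic: string; // nickname
  pic: string; // profile picture URI (the image itself may still live elsewhere)
  bio: string; // short bio
}

const profile: Profile = {
  nic: "alice",        // illustrative values
  pic: "ipfs://Qm...",
  bio: "Tezos builder",
};

// Stringify and hex-encode; the resulting bytes go straight into the
// big_map, with no 'ipfs://' indirection for the basic data.
const payloadBytes = char2Bytes(JSON.stringify(profile));
```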

The cost of adding the bytes as stringified JSON directly to the big_map is not that much more than adding the ipfs URI: roughly 1000 tez for the URIs vs. 2500 tez for the JSON over the full bootstrap. And this would allow us to bypass an entire IPFS layer: we could drop IPFS for the basic profile data, both for uploads and for the indexer (which currently has to download the payloads from IPFS). The IPFS layer also adds complications for the indexer and API, since different people will use different IPFS solutions.
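
For reference, these figures line up with the protocol’s storage burn of 250 mutez (0.00025 tez) per byte. A quick back-of-the-envelope check; the per-profile byte sizes are rough assumptions, not measured values:

```typescript
// Back-of-the-envelope storage burn on Tezos: 250 mutez per byte.
const TEZ_PER_BYTE = 0.00025;
const PROFILES = 43454;

// Assumed average entry sizes (illustrative, not measured):
const ipfsUriBytes = 90;      // an 'ipfs://' URI entry
const jsonPayloadBytes = 230; // a stringified { nic, pic, bio } payload

console.log(PROFILES * ipfsUriBytes * TEZ_PER_BYTE);     // ~978 tez
console.log(PROFILES * jsonPayloadBytes * TEZ_PER_BYTE); // ~2499 tez
```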

Would that not be great & preferable? Any thoughts?

I was thinking of applying to the DAO to cover the bootstrap funds :sweat_smile:

4 Likes