AI Alignment Problem

Contract: 0x24c96b1a6602d177c162f7085d397715009f348c
Minted 1y ago
24 tokens
10% royalty
Paul previously ran the language model alignment team at OpenAI, the creators of ChatGPT. He now runs the Alignment Research Center, a non-profit research organization whose mission is to align future machine learning systems with human interests. We’re on a mission to calm our fears about AI. This episode explores the solution landscape of the AI alignment problem, and we hope Paul can guide us on that journey. Does humanity have a chance? Artwork generated from the moment Paul states: “The most likely way we die involves not AI coming out of the blue and killing everyone, but involves us having deployed a lot of AI everywhere. And you can just look and be like, oh yeah, if for some reason, god ...
Sales: 0
Overall Score: Critical Risk
Critical Risks: 4
Medium Risks: 2
Low Risks: 6

Details

Metadata Storage: Centralized storage
Artifact Storage: Centralized storage
Collection Sales: 0 sales in the last month
Editable Metadata: Editable metadata stored on a server (see the sketch below)
Creator Royalty: 10%
Unique Owners: 0 / 0% (collection of 24 tokens)
External Contract: Custom contract
Token Type: ERC-721
Past Owners: 0 of 0 owners are suspicious
Suspicious Sales: 0 potential sales
Missed Royalty Payments: 0% missed royalty fee
Copymints: The original collection; found 0 similar collections
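
The “Centralized storage”, “Editable Metadata”, and “Creator Royalty” flags above can be spot-checked directly against the contract. Below is a minimal sketch, assuming an ethers v6 / JSON-RPC setup; the RPC URL and token ID are placeholders, and only the contract address comes from this page. It reads tokenURI and inspects the URI scheme (an http(s) URI points at a mutable server, while ipfs:// or ar:// is content-addressed), then probes the optional ERC-2981 royaltyInfo interface, which this contract may not implement since the 10% royalty can also be enforced off-chain by marketplaces.

```ts
import { ethers } from "ethers";

// Placeholders: a hypothetical RPC endpoint and an arbitrary token ID.
const RPC_URL = "https://eth-rpc.example.com";
const CONTRACT = "0x24c96b1a6602d177c162f7085d397715009f348c"; // address from this page
const TOKEN_ID = 1;

const abi = [
  "function tokenURI(uint256 tokenId) view returns (string)",
  // ERC-2981 (optional): the contract may not implement it.
  "function royaltyInfo(uint256 tokenId, uint256 salePrice) view returns (address, uint256)",
];

async function main() {
  const provider = new ethers.JsonRpcProvider(RPC_URL);
  const nft = new ethers.Contract(CONTRACT, abi, provider);

  // Metadata location: http(s) URIs point at a server that can edit or drop
  // the metadata; ipfs:// or ar:// URIs are content-addressed.
  const uri: string = await nft.tokenURI(TOKEN_ID);
  const centralized = uri.startsWith("http://") || uri.startsWith("https://");
  console.log(`tokenURI: ${uri}`);
  console.log(centralized ? "Metadata is server-hosted (centralized)" : "Metadata is content-addressed");

  // On-chain royalty, if exposed via ERC-2981.
  try {
    const [receiver, amount] = await nft.royaltyInfo(TOKEN_ID, ethers.parseEther("1"));
    console.log(`Royalty on a 1 ETH sale: ${ethers.formatEther(amount)} ETH to ${receiver}`);
  } catch {
    console.log("No ERC-2981 royaltyInfo; the 10% royalty is likely enforced off-chain by marketplaces.");
  }
}

main().catch(console.error);
```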

Sales by marketplace

Wash trading volume by marketplace

Profit & Loss by token
Tokens bought and sold by the same address. No data.
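
For context, a “Profit & Loss by token” view like the one above is typically built from round trips where one wallet both buys and later sells the same token, which is also a common wash-trading heuristic. A minimal sketch under that assumption (the Sale record shape and helper are hypothetical, not this page’s methodology):

```ts
// Hypothetical sale record assembled from marketplace or Transfer events;
// the field names are assumptions, not this page's API.
interface Sale {
  tokenId: string;
  buyer: string;
  seller: string;
  priceEth: number;
  timestamp: number;
}

// Realized P&L per token for wallets that bought and later sold the same
// token. Frequent round trips by one wallet are also a common wash-trading
// signal.
function profitAndLossByToken(sales: Sale[]): Map<string, number> {
  const pnl = new Map<string, number>();
  const lastBuy = new Map<string, number>(); // key: `${tokenId}:${owner}` -> purchase price

  for (const s of [...sales].sort((a, b) => a.timestamp - b.timestamp)) {
    const sellerKey = `${s.tokenId}:${s.seller}`;
    const buyPrice = lastBuy.get(sellerKey);
    if (buyPrice !== undefined) {
      // The seller previously bought this token: record the realized P&L.
      pnl.set(s.tokenId, (pnl.get(s.tokenId) ?? 0) + (s.priceEth - buyPrice));
      lastBuy.delete(sellerKey);
    }
    lastBuy.set(`${s.tokenId}:${s.buyer}`, s.priceEth);
  }
  return pnl;
}

// With this collection's zero recorded sales, the result is empty ("No data").
console.log(profitAndLossByToken([]));
```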