GMGN Wallet Stat Scraper

GMGN Wallet Stat Scraper is an Apify actor that automatically extracts statistical data from crypto wallets on the GMGN.ai platform. Working across multiple blockchain networks (Ethereum, BSC, Base, Solana, Blast, and Tron), it collects comprehensive wallet-level information such as trade counts, profit/loss ratios, and performance indicators, so you can monitor and analyze wallet activity over time.

Why Should You Use GMGN Wallet Stat Scraper?

GMGN Wallet Stat Scraper automates manual data collection and gives you access to up-to-date wallet statistics. It offers the following advantages:

  • Time Saving: Saves you hours by automating manual data collection processes
  • Comprehensive Data: Complete statistical data set including trading numbers, profit/loss ratios, and risk metrics
  • Multi-Blockchain Support: Works on Ethereum, BSC, Base, Solana, Blast, and Tron networks
  • Customizable Data Collection: Ability to collect statistical data over different time periods

Features

  • Extracts comprehensive statistical data from crypto wallets on GMGN.ai
  • Can scan multiple wallet addresses simultaneously
  • Can retrieve wallet statistics for different time periods (1 day, 7 days, 30 days, all time)
  • Stores collected data in Apify data storage and allows you to export it in various formats (JSON, CSV, Excel)
  • Provides faster and more reliable results with proxy support
  • Automatic wallet address validation and normalization to the expected format (see the sketch after this list)
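
As an illustration of that last feature, here is a minimal validation sketch in Python. The patterns below (a hex check for EVM-style chains and Base58 length checks for Solana and Tron) are reasonable assumptions about these address formats, not the actor's exact internal logic.

import re

# Illustrative patterns only; assumptions about common address formats.
EVM_ADDRESS = re.compile(r"^0x[0-9a-fA-F]{40}$")               # eth, bsc, base, blast
SOLANA_ADDRESS = re.compile(r"^[1-9A-HJ-NP-Za-km-z]{32,44}$")  # Base58, no 0/O/I/l
TRON_ADDRESS = re.compile(r"^T[1-9A-HJ-NP-Za-km-z]{33}$")      # Base58Check, starts with T

def normalize_wallet_address(address: str, chain: str) -> str:
    """Validate an address for the given chain and return a normalized form."""
    address = address.strip()
    if chain in ("eth", "bsc", "base", "blast"):
        if not EVM_ADDRESS.match(address):
            raise ValueError(f"Invalid EVM address: {address}")
        return address.lower()  # EVM addresses are case-insensitive
    if chain == "sol":
        if not SOLANA_ADDRESS.match(address):
            raise ValueError(f"Invalid Solana address: {address}")
        return address  # Base58 is case-sensitive; keep as-is
    if chain == "tron":
        if not TRON_ADDRESS.match(address):
            raise ValueError(f"Invalid Tron address: {address}")
        return address
    raise ValueError(f"Unsupported chain: {chain}")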

Use Cases

  • Portfolio Performance Analysis: Analyze the overall performance and trading behaviors of wallets
  • Risk Assessment: Evaluate the risk profiles and investment strategies of wallets
  • Investor Behavior Study: Examine behavior models of successful investors
  • Market Research: Research the performance of crypto assets in different wallets
  • Data Science Projects: Create comprehensive statistical datasets for crypto markets

Usage

  1. Run this actor in the Apify console.
  2. Provide the required inputs:
    • walletAddresses: Wallet addresses to be scanned (you can enter multiple addresses)
    • chain: Blockchain network to be scanned (eth, bsc, base, sol, blast, tron)
    • period: Statistical time period (1d, 7d, 30d, all)
    • proxyConfiguration: Proxy configuration

Example Input

{
  "walletAddresses": ["0xd8da6bf26964af9d7eed9e03e53415d37aa96045"],
  "chain": "eth",
  "period": "all",
  "proxyConfiguration": {
    "useApifyProxy": true,
    "apifyProxyGroups": []
  }
}
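
If you prefer to start runs programmatically instead of from the console, a minimal sketch using the Apify Python client could look like the one below. The actor ID "username/gmgn-wallet-stat-scraper" is a placeholder; use the actual ID shown on the actor's page.

from apify_client import ApifyClient

# Authenticate with your Apify API token.
client = ApifyClient("YOUR_APIFY_TOKEN")

run_input = {
    "walletAddresses": ["0xd8da6bf26964af9d7eed9e03e53415d37aa96045"],
    "chain": "eth",
    "period": "all",
    "proxyConfiguration": {"useApifyProxy": True, "apifyProxyGroups": []},
}

# Placeholder actor ID; replace with the real one from the store page.
run = client.actor("username/gmgn-wallet-stat-scraper").call(run_input=run_input)

# Results are written to the run's default dataset.
for item in client.dataset(run["defaultDatasetId"]).iterate_items():
    print(item["wallet_address"], item.get("winrate"))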

Output

The collected data is saved to the Apify dataset. The output data includes the following fields:

  • wallet_address: Wallet address
  • chain: Blockchain network
  • period: Statistical time period
  • buy: Total number of buys
  • buy_1d: Number of buys in 1 day
  • buy_7d: Number of buys in 7 days
  • buy_30d: Number of buys in 30 days
  • sell: Total number of sells
  • sell_1d: Number of sells in 1 day
  • sell_7d: Number of sells in 7 days
  • sell_30d: Number of sells in 30 days
  • pnl: Profit/Loss ratio
  • pnl_1d: 1-day Profit/Loss ratio
  • pnl_7d: 7-day Profit/Loss ratio
  • pnl_30d: 30-day Profit/Loss ratio
  • all_pnl: All-time Profit/Loss ratio
  • realized_profit: Realized profit
  • realized_profit_1d: 1-day realized profit
  • realized_profit_7d: 7-day realized profit
  • realized_profit_30d: 30-day realized profit
  • unrealized_profit: Unrealized profit
  • unrealized_pnl: Unrealized Profit/Loss ratio
  • total_profit: Total profit
  • total_profit_pnl: Total profit Profit/Loss ratio
  • balance: Token balance
  • eth_balance: ETH balance
  • sol_balance: SOL balance
  • trx_balance: TRX balance
  • bnb_balance: BNB balance
  • total_value: Total value
  • winrate: Win rate
  • token_sold_avg_profit: Average profit of sold tokens
  • history_bought_cost: Historical purchase cost
  • token_avg_cost: Average token cost
  • token_num: Number of tokens
  • profit_num: Number of profitable tokens
  • pnl_lt_minus_dot5_num: Number of tokens with PnL < -0.5
  • pnl_minus_dot5_0x_num: Number of tokens with PnL between -0.5 and 0x
  • pnl_lt_2x_num: Number of tokens with PnL < 2x
  • pnl_2x_5x_num: Number of tokens with PnL between 2x and 5x
  • pnl_gt_5x_num: Number of tokens with PnL > 5x
  • bind: Binding status
  • avatar: Avatar URL
  • name: Name
  • ens: ENS name
  • tags: Tags
  • tag_rank: Tag ranking
  • twitter_name: Twitter name
  • twitter_username: Twitter username
  • twitter_bind: Twitter binding status
  • twitter_fans_num: Number of Twitter followers
  • followers_count: Followers count
  • is_contract: Whether it is a contract
  • last_active_timestamp: Last active time
  • risk: Risk metrics
  • avg_holding_peroid: Average holding period
  • updated_at: Update time
  • refresh_requested_at: Refresh request time

Example Output

{
  "buy": 5,
  "buy_1d": 0,
  "buy_7d": 0,
  "buy_30d": 0,
  "sell": 78,
  "sell_1d": 2,
  "sell_7d": 2,
  "sell_30d": 3,
  "pnl": -0.0002872711899513403,
  "pnl_1d": 0,
  "pnl_7d": 0,
  "pnl_30d": 0,
  "all_pnl": -0.012391918863559689,
  "realized_profit": 0,
  "realized_profit_1d": 0,
  "realized_profit_7d": 0,
  "realized_profit_30d": 0,
  "unrealized_profit": -1132.7714693194157,
  "unrealized_pnl": -0.0009643805337237632,
  "total_profit": -2256.4155123647643,
  "total_profit_pnl": -0.02441492602232377,
  "balance": "521.632140040817202251",
  "eth_balance": "521.632140040817202251",
  "sol_balance": "521.632140040817202251",
  "trx_balance": "521.632140040817202251",
  "bnb_balance": "521.632140040817202251",
  "total_value": 19087194637518.703,
  "winrate": 0,
  "token_sold_avg_profit": -32.10411551558139,
  "history_bought_cost": 1174610.4672451697,
  "token_avg_cost": 391536.8224150566,
  "token_num": 35,
  "profit_num": 0,
  "pnl_lt_minus_dot5_num": 0,
  "pnl_minus_dot5_0x_num": 2,
  "pnl_lt_2x_num": 33,
  "pnl_2x_5x_num": 0,
  "pnl_gt_5x_num": 0,
  "bind": false,
  "avatar": "https://pbs.twimg.com/profile_images/1880759276169224192/rXpjZO0A_400x400.jpg",
  "name": "vitalik.eth",
  "ens": "tipsforcoins.eth",
  "tags": [
    "kol",
    "bluechip_owner"
  ],
  "tag_rank": {
    "bluechip_owner": 0,
    "kol": 139
  },
  "twitter_name": "vitalik.eth",
  "twitter_username": "VitalikButerin",
  "twitter_bind": false,
  "twitter_fans_num": 5734536,
  "followers_count": 5734536,
  "is_contract": false,
  "last_active_timestamp": 1743451607,
  "risk": {
    "token_active": 35,
    "token_honeypot": 0,
    "token_honeypot_ratio": 0,
    "no_buy_hold": 7422,
    "no_buy_hold_ratio": 0.995306423494703,
    "sell_pass_buy": 33,
    "sell_pass_buy_ratio": 0.9428571428571428,
    "fast_tx": 1,
    "fast_tx_ratio": 0.02857142857142857
  },
  "avg_holding_peroid": 0,
  "updated_at": 1743458941,
  "refresh_requested_at": null,
  "wallet_address": "0xd8da6bf26964af9d7eed9e03e53415d37aa96045",
  "chain": "eth",
  "period": "all"
}

This example output shows the statistical data for a single wallet. The actual output will be a list of similar objects for all processed wallets.
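
Once a run finishes, the dataset items can be post-processed like any other structured data. Here is a minimal sketch that assumes the results were exported to a local results.json file (the filename is a placeholder):

import json

# Load the exported dataset (JSON export from the Apify console or API).
with open("results.json", encoding="utf-8") as f:
    wallets = json.load(f)

# Example analysis: rank wallets by 30-day realized profit, then win rate.
ranked = sorted(
    wallets,
    key=lambda w: (w.get("realized_profit_30d", 0), w.get("winrate", 0)),
    reverse=True,
)

for w in ranked[:10]:
    print(
        f"{w['wallet_address']} ({w['chain']}): "
        f"30d profit={w.get('realized_profit_30d', 0):.2f}, "
        f"winrate={w.get('winrate', 0):.2%}"
    )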

Notes

  • The collected data is stored in the run's default Apify dataset, from which you can export it in JSON, CSV, or Excel format (see the sketch below).
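
The Apify API's dataset items endpoint accepts a format parameter (json, csv, xlsx, and others), so exports can also be scripted. A minimal sketch using requests; the dataset ID and API token are placeholders:

import requests

DATASET_ID = "YOUR_DATASET_ID"  # placeholder: the run's default dataset ID
API_TOKEN = "YOUR_APIFY_TOKEN"  # placeholder: your Apify API token

# Download the dataset as CSV; change "csv" to "json" or "xlsx" as needed.
url = f"https://api.apify.com/v2/datasets/{DATASET_ID}/items"
resp = requests.get(url, params={"format": "csv", "token": API_TOKEN})
resp.raise_for_status()

with open("wallet_stats.csv", "wb") as f:
    f.write(resp.content)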

Frequently Asked Questions

Is it legal to scrape public wallet data?

Yes, if you're scraping publicly available data for personal or internal use. Always review GMGN.ai's Terms of Service before large-scale use or redistribution.

Do I need to code to use this scraper?

No. This is a no-code tool: just enter the wallet addresses, chain, and time period, then run the scraper directly from your dashboard or the Apify actor page.

What data does it extract?

It extracts buy/sell counts, profit/loss ratios, realized and unrealized profit, balances, win rate, token PnL distribution, and risk metrics for each wallet. You can export all of it to JSON, CSV, or Excel.

Can I scan multiple wallets or choose a time period?

Yes, you can pass multiple wallet addresses in a single run and select the chain and statistical period (1d, 7d, 30d, all) in the input settings.

How do I get started?

Use the Try Now button on this page to open the scraper. You'll be guided to enter wallet addresses and will get structured results. No setup needed!