Compare commits


No commits in common. "main" and "0.1.6" have entirely different histories.
main ... 0.1.6

26 changed files with 1147 additions and 1428 deletions


@ -1,13 +0,0 @@
[settings]
PORT=5555
SERVER_URL=localhost
SERVER_CONNECTION_TYPE=proxy
CA_KEY=ca.key
CA_CERT=ca.crt
CERT_KEY=cert.key
CERT_DIR=certs/
#OPENSSL_BINPATH=openssl
CLIENT_ENCODING=utf-8
USE_EXTENSIONS=wayback.Wayback,bio.PyBio,alwaysonline.AlwaysOnline
ES_HOST=http://127.0.0.1:9200
ES_INDEX=alwaysonline

.github/FUNDING.yml vendored (8 changed lines)

@ -1,2 +1,8 @@
# These are supported funding model platforms
github: gnh1201
custom: ['https://gnh1201.link']
open_collective: welsonjs
liberapay: catswords
custom: ['https://www.buymeacoffee.com/catswords', 'https://toss.me/catswords']
patreon: catswords # Replace with a single Patreon username
ko_fi: catswords

.github/workflows/lint.yml vendored Normal file (13 changed lines)

@ -0,0 +1,13 @@
name: Ruff
on: [push, pull_request]
concurrency:
group: ${{ github.workflow }}-${{ github.event_name == 'pull_request' && github.head_ref || github.ref }}
cancel-in-progress: true
jobs:
ruff:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v4
- uses: chartboost/ruff-action@v1


@ -1,23 +0,0 @@
name: AI Code Review
on:
pull_request:
types: [opened, synchronize, reopened]
issues:
types: [opened, reopened]
jobs:
repofix:
runs-on: ubuntu-latest
steps:
- name: Run RepoFixAI
uses: Manav916/llm-code-review@main
with:
groq_api_key: ${{ secrets.GROQ_API_KEY }}
groq_model: 'gemma-2-9b-it'
github_token: ${{ secrets.GITHUB_TOKEN }}
# exclude_extensions: 'txt'
repo_owner: ${{ github.repository_owner }}
repo_name: ${{ github.event.repository.name }}
event_number: ${{ github.event.number || github.event.issue.number }} # when listening for both pull requests and issues
event_name: ${{ github.event_name }}

.gitmodules vendored (3 changed lines)

@ -1,3 +0,0 @@
[submodule "plugins"]
path = plugins
url = https://github.com/gnh1201/caterpillar-plugins


@ -1,43 +1,24 @@
# Caterpillar Proxy (Songchoongi Project)
# gnh1201/caterpillar
Caterpillar Proxy - The simple web debugging proxy (formerly, php-httpproxy)
[![FOSSA Status](https://app.fossa.com/api/projects/git%2Bgithub.com%2Fgnh1201%2Fcaterpillar.svg?type=shield)](https://app.fossa.com/projects/git%2Bgithub.com%2Fgnh1201%2Fcaterpillar?ref=badge_shield)
[![DOI 10.5281/zenodo.13346533](https://zenodo.org/badge/DOI/10.5281/zenodo.13346533.svg)](https://doi.org/10.5281/zenodo.13346533)
[![ChatGPT available](https://img.shields.io/badge/ChatGPT-74aa9c?logo=openai&logoColor=white)](#)
[![slideshare.net available](https://img.shields.io/badge/SlideShare-black?logo=slideshare)](https://www.slideshare.net/slideshow/2024-caterpillar-project-in-2024-korea-oss-contest/273031732)
[![Discord chat](https://img.shields.io/discord/359930650330923008?logo=discord)](https://discord.gg/9VVTHpfsVW)
[![Open to work](https://img.shields.io/badge/%23-OPENTOWORK-green)](https://github.com/gnh1201/welsonjs/discussions/167)
Caterpillar Proxy (Songchoongi Project) - The simple web debugging proxy (formerly, php-httpproxy)
![A cover image: Caterpillar on a tree looking at a rocket flying over the clouds](assets/img/cover.png)
You can connect all physical and logical channels with communication capabilities to the web!
Imagine various means such as time machines, satellites, quantum technology, sound, light, the Philosopher's Stone, or Excalibur, just like in science fiction movies! Caterpillar Proxy supports the implementation of extensions for Connectors, Filters, and RPC methods to bring your imagination to life.
:rocket: [Open the Caterpillar Proxy Web Console](https://pub-1a7a176eea68479cb5423e44273657ad.r2.dev/console.html)
![Cover image - Caterpillar on a tree looking at a rocket flying over the clouds](assets/img/cover.png)
## Use cases
* [Build a network tunnel using Python and the LAMP(PHP) stack (qiita.com)](https://qiita.com/gnh1201/items/40f9350ca6d308def6d4)
* [K-Anonymity for Spam Filtering: Case with Mastodon, and Misskey (qiita.com)](https://qiita.com/gnh1201/items/09f4081f84610db3a9d3)
* [File Upload Vulnerability Attack Test (Caterpillar Proxy) (youtu.be) ](https://youtu.be/sPZOCgYtLRw)
* [Real-time processing of emergency disaster sensor data (e.g., fire detection).](https://catswords.social/@catswords_oss/114016647285923011)
* [Build a network tunnel using Python and the LAMP(PHP) stack.](https://qiita.com/gnh1201/items/40f9350ca6d308def6d4)
* [K-Anonymity for Spam Filtering: Case with Mastodon, and Misskey](https://qiita.com/gnh1201/items/09f4081f84610db3a9d3)
* [[YouTube Video] File Upload Vulnerability Attack Test (Caterpillar Proxy)](https://youtu.be/sPZOCgYtLRw)
## How it works
### Basic structure
```
* You <-> Proxy client (Python) <-> Parasitized proxy server (Optional, PHP/LAMP) <-> On the Web
* You <-> Proxy client (Python) <-> Connector extensions (Optional, Python) <-> On the Web
You <-> Proxy client (Python) <-> Parasitized proxy server (Optional, PHP/LAMP) <-> On the Web
```
For example, you can build a simple web debugging proxy on a shared server.
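For a quick end-to-end check, you can point an ordinary HTTP client at the local listener. The sketch below is illustrative only: it assumes the proxy client is running locally and listening on port 5555, matching the `PORT=5555` value from the sample settings shown in this compare; adjust the address and port to your own configuration.

```python
# Minimal sketch: route a request through a locally running Caterpillar Proxy client.
# Assumes the client listens on 127.0.0.1:5555 (PORT=5555 in the sample settings).
import requests

proxies = {
    "http": "http://127.0.0.1:5555",
    "https": "http://127.0.0.1:5555",
}
response = requests.get("http://example.org/", proxies=proxies, timeout=10)
print(response.status_code, len(response.content))
```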
### Stateful mode
This project supports two modes of connection. The default is stateless. You can use the stateful mode to avoid being constrained by transfer capacity limits. See the [Stateful mode (catswords-oss.rdbl.io)](https://catswords-oss.rdbl.io/1155378128/5211324242).
### Connector extensions
This project supports the implementation of Connector extensions. The provided basic examples include implementations of web archives (caches) and serial communication as Connector extensions. See the [caterpillar-plugins repository (github.com)](https://github.com/gnh1201/caterpillar-plugins) for more.
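For illustration, a minimal Connector extension can be sketched as below. It mirrors the shape of the bundled `AlwaysOnline` connector shown later in this compare (subclass `Extension`, set `type = "connector"` and a `connection_type`, then implement `connect()`); the `EchoConnector` class and its `"echo"` connection type are hypothetical names, not part of the repository.

```python
# Hypothetical minimal Connector extension (illustration only), modeled on the
# structure of plugins/alwaysonline.py in this compare.
from base import Extension, Logger

logger = Logger(name="echo")

class EchoConnector(Extension):
    def __init__(self):
        self.type = "connector"        # handled by the proxy as a connector
        self.connection_type = "echo"  # hypothetical connection type name
        self.buffer_size = 8192

    def connect(self, conn, data, webserver, port, scheme, method, url):
        # method and url arrive as bytes; echo the raw request back to the client
        logger.info("[*] EchoConnector handling %s %s" % (method.decode(), url.decode()))
        conn.send(data)
        return True  # report the connection as handled
```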
This project supports two modes of connection. The default is stateless. You can use the stateful mode to avoid being constrained by transfer capacity limits. See the [Stateful mode (github.com/gnh1201/caterpillar wiki)](https://github.com/gnh1201/caterpillar/wiki/Stateful-mode).
## (Optional) Before use
If you have a server that ***will be parasitized*** and you want to proxy it, you should upload the `index.php` file to a shared server. The index.php file is located in the `assets/php` directory within this repository.
@ -47,7 +28,6 @@ If you have a server that ***will be parasitized*** and you want to proxy it, yo
```
[settings]
CONNECTION_TIMEOUT=1
PORT=5555
SERVER_URL=localhost
SERVER_CONNECTION_TYPE=
@ -81,9 +61,9 @@ sudo update-ca-certificates
4. (Optional) With [Cloudflare](https://cloudflare.com), you can expect up to 4x faster transfers and fewer network stalls.
## Extensions
* [Web Console](https://pub-1a7a176eea68479cb5423e44273657ad.r2.dev/console.html)
* Fediverse (e.g., Mastodon): See the [Fediverse (catswords-oss.rdbl.io)](https://catswords-oss.rdbl.io/1155378128/3821602484).
* Wayback (Private browsing with Google or Wayback cache): See the [Wayback (catswords-oss.rdbl.io)](https://catswords-oss.rdbl.io/1155378128/6994492654)
* [Web Console Available](https://pub-1a7a176eea68479cb5423e44273657ad.r2.dev/console.html)
* Fediverse (e.g., Mastodon): See the [Fediverse (github.com/gnh1201/caterpillar wiki)](https://github.com/gnh1201/caterpillar/wiki/Fediverse).
* Wayback (Private browsing with Google or Wayback cache): See the [Wayback (github.com/gnh1201/caterpillar wiki)](https://github.com/gnh1201/caterpillar/wiki/Wayback).
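The extensions listed above are enabled through the client configuration. As a sketch, the relevant key is the same `USE_EXTENSIONS` entry that appears in the example settings file at the top of this compare, with comma-separated `module.ClassName` values:

```
[settings]
USE_EXTENSIONS=wayback.Wayback,bio.PyBio,alwaysonline.AlwaysOnline
```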
## Thanks to
* Pan Art by [@yeohangdang@i.peacht.art](#): [Image File](assets/img/logo.png)
@ -98,18 +78,6 @@ sudo update-ca-certificates
![Roadmap image](assets/img/roadmap.png)
## Report abuse
- abuse@catswords.net
- [GitHub Security Advisories (gnh1201/caterpillar)](https://github.com/gnh1201/caterpillar/security)
## Join the community
- ActivityPub [@catswords_oss@catswords.social](https://catswords.social/@catswords_oss)
- XMPP [catswords@conference.omemo.id](xmpp:catswords@conference.omemo.id?join)
- [Join Catswords OSS on Microsoft Teams (teams.live.com)](https://teams.live.com/l/community/FEACHncAhq8ldnojAI)
- [Join Catswords OSS #caterpillar on Discord (discord.gg)](https://discord.gg/9VVTHpfsVW)
## Special channels
- A [paid consultation channel (m.expert.naver.com)](https://m.expert.naver.com/mobile/expert/product/detail?storeId=100051156&productId=100144540) is available for the Korean (한국어) region.
- A [private operations channel (forms.gle)](https://forms.gle/ZKAAaGTiGamksHoo8) is available for all regions.
## License
[![FOSSA Status](https://app.fossa.com/api/projects/git%2Bgithub.com%2Fgnh1201%2Fcaterpillar.svg?type=large)](https://app.fossa.com/projects/git%2Bgithub.com%2Fgnh1201%2Fcaterpillar?ref=badge_large)
* ActivityPub [@gnh1201@catswords.social](https://catswords.social/@gnh1201)
* abuse@catswords.net
* [Join Catswords on Microsoft Teams](https://teams.live.com/l/community/FEACHncAhq8ldnojAI)


@ -1,61 +0,0 @@
<?php
// https://github.com/dimamedia/PHP-Simple-TOTP-and-PubKey
class tfa {
// RFC4648 Base32 alphabet
private $alphabet = "ABCDEFGHIJKLMNOPQRSTUVWXYZ234567";
function getOtp($key) {
/* Base32 decoder */
// Remove spaces from the given public key and converting to an array
$key = str_split(str_replace(" ","",$key));
$n = 0;
$j = 0;
$binary_key = "";
// Decode public key's each character to base32 and save into binary chunks
foreach($key as $char) {
$n = $n << 5;
$n = $n + stripos($this->alphabet, $char);
$j += 5;
if($j >= 8) {
$j -= 8;
$binary_key .= chr(($n & (0xFF << $j)) >> $j);
}
}
/* End of Base32 decoder */
// current unix time 30sec period as binary
$binary_timestamp = pack('N*', 0) . pack('N*', floor(microtime(true)/30));
// generate keyed hash
$hash = hash_hmac('sha1', $binary_timestamp, $binary_key, true);
// generate otp from hash
$offset = ord($hash[19]) & 0xf;
$otp = (
((ord($hash[$offset+0]) & 0x7f) << 24 ) |
((ord($hash[$offset+1]) & 0xff) << 16 ) |
((ord($hash[$offset+2]) & 0xff) << 8 ) |
(ord($hash[$offset+3]) & 0xff)
) % pow(10, 6);
return $otp;
}
function getPubKey() {
$alphabet = str_split($this->alphabet);
$key = '';
// generate 16 chars public key from Base32 alphabet
for ($i = 0; $i < 16; $i++) $key .= $alphabet[mt_rand(0,31)];
// split into 4x4 chunks for easy reading
return implode(" ", str_split($key, 4));
}
}
?>


@ -1,70 +0,0 @@
<?php
// coupang.class.php
// Coupang Product Search API integration class
// Namhyeon Go <gnh1201@gmail.com>
// https://github.com/gnh1201/welsonjs
//
date_default_timezone_set("GMT+0");
class CoupangProductSearch {
private $accessKey = "";
private $secretKey = "";
private $baseUrl = "https://api-gateway.coupang.com";
private function generateSignature($method, $path, $query = "") {
$datetime = (new \DateTime("now", new \DateTimeZone("GMT")))->format("ymd\THis\Z");
$message = $datetime . $method . $path . $query;
$signature = hash_hmac('sha256', $message, $this->secretKey);
return [
'authorization' => "CEA algorithm=HmacSHA256, access-key={$this->accessKey}, signed-date={$datetime}, signature={$signature}",
'datetime' => $datetime
];
}
public function searchProducts($keyword, $limit = 10, $subId = null, $imageSize = null, $srpLinkOnly = false) {
$path = "/v2/providers/affiliate_open_api/apis/openapi/products/search";
$queryParams = http_build_query([
'keyword' => $keyword,
'limit' => $limit,
'subId' => $subId,
'imageSize' => $imageSize,
'srpLinkOnly' => $srpLinkOnly
]);
$fullPath = $path . '?' . $queryParams;
$url = $this->baseUrl . $fullPath;
$signatureData = $this->generateSignature("GET", $path, $queryParams);
$authorization = $signatureData['authorization'];
$datetime = $signatureData['datetime'];
$headers = [
"Content-Type: application/json;charset=UTF-8",
"Authorization: $authorization"
];
$curl = curl_init();
curl_setopt($curl, CURLOPT_URL, $url);
curl_setopt($curl, CURLOPT_CUSTOMREQUEST, "GET");
curl_setopt($curl, CURLOPT_HTTPHEADER, $headers);
curl_setopt($curl, CURLOPT_RETURNTRANSFER, true);
$response = curl_exec($curl);
$httpCode = curl_getinfo($curl, CURLINFO_HTTP_CODE);
curl_close($curl);
if ($httpCode === 200) {
return json_decode($response, true);
} else {
try {
return json_decode($response, true);
} catch (Exception $e) {
return [
"status" => $httpCode,
"message" => $e->getMessage()
];
}
}
}
}


@ -1,72 +1,37 @@
<?php
/* index.php
* Caterpillar Proxy Worker on PHP runtime
* Caterpillar Worker on PHP
*
* Caterpillar Proxy - The simple web debugging proxy (formerly, php-httpproxy)
* Namhyeon Go (Catswords Research) <abuse@catswords.net>
* https://github.com/gnh1201/caterpillar
* Created at: 2022-10-06
* Updated at: 2025-03-11
* Updated at: 2024-07-15
*/
define("PERF_START_TIME", microtime(true));
define("PHP_HTTPPROXY_VERSION", "0.1.6.10");
define("PHP_HTTPPROXY_VERSION", "0.1.5.24");
define("DEFAULT_SOCKET_TIMEOUT", 1);
define("STATEFUL_SOCKET_TIMEOUT", 30);
define("MAX_EXECUTION_TIME", 0);
define("ALLOW_INVOKE_INSECURE_METHOD", false);
define("ALLOW_LOAD_INSECURE_SCRIPT", true);
define("DEFAULT_USER_AGENT", 'php-httpproxy/' . PHP_HTTPPROXY_VERSION . ' (Server; PHP ' . phpversion() . '; Caterpillar Proxy)');
define("RELAY_ALLOW_METHODS", ""); // e.g., GET,POST
define("RELAY_PROXY_PASS", ""); // e.g., https://example.org
define("RELAY_IMAGE_FILE_EXTENSIONS", ".png,.gif,.jpg");
define("RELAY_STATIC_FILE_EXTENSIONS", ".js,.css");
define("RELAY_ENABLE_JS_REDIRECT", false);
error_reporting(E_ALL);
ini_set("display_errors", 0);
ini_set("default_socket_timeout", DEFAULT_SOCKET_TIMEOUT); // must be. because of `feof()` works
ini_set("max_execution_time", MAX_EXECUTION_TIME);
define("DEFAULT_USER_AGENT", $_SERVER['HTTP_USER_AGENT'] . '</p><hr><p>php-httpproxy/' . PHP_HTTPPROXY_VERSION . ' (Server; PHP ' . phpversion() . '; Caterpillar; abuse@catswords.net)');
header('Access-Control-Allow-Origin: *');
header('Access-Control-Allow-Methods: *');
header("Access-Control-Allow-Headers: *");
function get_current_execution_time() {
$end_time = microtime(true);
return $end_time - PERF_START_TIME;
if (strpos($_SERVER['HTTP_USER_AGENT'], "php-httpproxy/") !== 0 && strpos($_SERVER['HTTP_X_USER_AGENT'], "php-httpproxy/") !== 0) {
exit('<!DOCTYPE html><html><head><title>It works!</title><meta charset="utf-8"></head><body><h1>It works!</h1><p><a href="https://github.com/gnh1201/caterpillar">Download the client</a></p><p>' . DEFAULT_USER_AGENT . '</p></body></html>');
}
function array_get($key, $arr, $default = null) {
return array_key_exists($key, $arr) ? $arr[$key] : $default;
}
function server_env_get($key) {
return array_get($key, $_SERVER, "");
}
function verify_integrity($data, $integrity) {
if (strpos($integrity, 'sha384-') !== 0) {
return false;
}
$encoded_hash = substr($integrity, 7);
$decoded_hash = base64_decode($encoded_hash);
$calculated_hash = hash('sha384', $data, true);
return hash_equals($calculated_hash, $decoded_hash);
}
function cast_to_array($data) {
return is_array($data) ? $data : array($data);
}
ini_set("default_socket_timeout", DEFAULT_SOCKET_TIMEOUT); // must be. because of `feof()` works
ini_set("max_execution_time", MAX_EXECUTION_TIME);
function jsonrpc2_encode($method, $params, $id = '') {
$data = array(
"jsonrpc" => "2.0",
"method" => $method,
"params" => $params,
"id" => $id,
"_execution_time" => get_current_execution_time()
"id" => $id
);
return json_encode($data);
}
@ -75,8 +40,7 @@ function jsonrpc2_result_encode($result, $id = '') {
$data = array(
"jsonrpc" => "2.0",
"result" => $result,
"id" => $id,
"_execution_time" => get_current_execution_time()
"id" => $id
);
return json_encode($data);
}
@ -85,8 +49,7 @@ function jsonrpc2_error_encode($error, $id = '') {
$data = array(
"jsonrpc" => "2.0",
"error" => $error,
"id" => $id,
"_execution_time" => get_current_execution_time()
"id" => $id
);
return json_encode($data);
}
@ -108,7 +71,7 @@ function fatal_handler() {
$errstr = $error["message"];
header("HTTP/1.1 200 OK");
exit("\r\n\r\n" . jsonrpc2_error_encode(array(
exit(jsonrpc2_error_encode(array(
"status" => 503,
"code" => $errno,
"message"=> "Error occurred in file '$errfile' at line $errline: $errstr"
@ -117,27 +80,6 @@ function fatal_handler() {
}
register_shutdown_function("fatal_handler");
function load_script($data) {
$loaded_script = false;
if (!ALLOW_LOAD_INSECURE_SCRIPT) {
return $loaded_script;
}
$fh = tmpfile();
if ($fh !== false) {
if (!(strpos($data, "<?") !== false)) {
$data = "<?php\r\n\r\n" . $data . "\r\n\r\n?>";
}
fwrite($fh, $data);
$path = stream_get_meta_data($fh)['uri'];
$loaded_script = include($path);
fclose($fh);
}
return $loaded_script;
}
// https://stackoverflow.com/questions/16934409/curl-as-proxy-deal-with-https-connect-method
// https://stackoverflow.com/questions/12433958/how-to-parse-response-headers-in-php
function parse_headers($str) { // Parses HTTP headers into an array
@ -276,12 +218,12 @@ function relay_connect($params, $id = '') {
}
function relay_mysql_connect($params) {
$hostname = array_get("hostname", $params, "localhost");
$username = array_get("username", $params, "root");
$password = array_get("password", $params, "");
$database = array_get("database", $params, null);
$port = intval(array_get("port", $params, 3306));
$charset = array_get("charset", $params, "utf8");
$hostname = $params['hostname'];
$username = $params['username'];
$password = $params['password'];
$database = array_key_exists('database', $params) ? $params['database'] : null;
$port = array_key_exists('port', $params) ? intval($params['port']) : 3306;
$charset = array_key_exists('charset', $params) ? $params['charset'] : "utf8";
try {
$mysqli = new mysqli($hostname, $username, $password, $database, $port);
@ -347,15 +289,7 @@ function relay_mysql_query($params, $mysqli) {
case "show":
case "select":
$success = true;
if (function_exists("mysqli_fetch_all")) {
$result['data'] = mysqli_fetch_all($query_result, MYSQLI_ASSOC);
} else {
$data = array();
while ($row = $query_result->fetch_assoc()) {
$data[] = $row;
}
$result['data'] = $data;
}
$result['data'] = mysqli_fetch_all($query_result, MYSQLI_ASSOC);
break;
case "insert":
@ -423,24 +357,6 @@ function relay_get_phpversion() {
);
}
function relay_get_env_hash() {
$params = array(
"php_version" => phpversion(),
"php_os" => PHP_OS,
"php_sapi" => PHP_SAPI,
"loaded_extensions" => get_loaded_extensions(),
"ini_settings" => ini_get_all(null, false)
);
$serialized_params = serialize($params);
return array(
"data" => array(
sha1($serialized_params),
md5($serialized_params)
)
);
}
function relay_get_loaded_extensions() {
return array(
"data" => get_loaded_extensions()
@ -473,85 +389,15 @@ function relay_dns_get_record($params) {
function relay_fetch_url($params) {
$url = $params['url'];
$method = array_get("method", $params, "GET");
$headers = array_get("headers", $params, array());
$data = array_get("data", $params, '');
// from local source
$local_prefix = "file:";
$pos = strpos($url, $local_prefix);
if ($pos !== false && $pos === 0) {
$path = realpath(substr($url, strlen($local_prefix)));
$basedir = realpath(__DIR__);
if ($path && strpos($path, $basedir) === 0) {
if (file_exists($path)) {
$response = file_get_contents($path);
return array(
"success" => true,
"result" => array(
"status" => 200,
"data" => $response
)
);
} else {
return array(
"success" => false,
"error" => array(
"status" => 404,
"code" => -1,
"message" => "Not found"
)
);
}
} else {
return array(
"success" => false,
"error" => array(
"status" => 403,
"code" => -1,
"message" => "Access denied"
)
);
}
}
// from remote source
$_headers = array();
if (is_array($headers) && count($headers) > 0) {
foreach ($headers as $header_line) {
$pos = strpos($header_line, ':');
if ($pos !== false) {
$header_key = trim(substr($header_line, 0, $pos));
$header_value = trim(substr($header_line, $pos + 1));
$_header_line = sprintf("%s: %s", $header_key, $header_value);
array_push($_headers, $_header_line);
}
}
}
try {
$ch = curl_init();
curl_setopt($ch, CURLOPT_URL, $url);
curl_setopt($ch, CURLOPT_USERAGENT, DEFAULT_USER_AGENT);
curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 30);
curl_setopt($ch, CURLOPT_CONNECTTIMEOUT, 1);
curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);
curl_setopt($ch, CURLOPT_DNS_USE_GLOBAL_CACHE, false);
curl_setopt($ch, CURLOPT_DNS_CACHE_TIMEOUT, 30);
// check the request headers
if (count($_headers) > 0) {
curl_setopt($ch, CURLOPT_HTTPHEADER, $_headers);
}
// check it is POST request
if ($method == "POST") {
curl_setopt($ch, CURLOPT_POSTFIELDS, cast_to_array($data));
curl_setopt($ch, CURLOPT_POST, true);
}
// make cURL instance
$response = curl_exec($ch);
$error_code = curl_errno($ch);
if ($error_code) {
@ -605,58 +451,10 @@ function relay_get_geolocation() {
}
}
function relay_invoke_method($params) {
$callback = $params['callback'];
$requires = cast_to_array($params['requires']);
$args = cast_to_array($params['args']);
if (!ALLOW_INVOKE_INSECURE_METHOD) {
$allow_callbacks = array("phpinfo", "idn_to_ascii", "idn_to_utf8", "load_script");
if (!in_array($callback, $allow_callbacks)) {
return array(
"success" => false,
"error" => array(
"status" => 403,
"code" => -1,
"message" => $callback . " is not allowed"
)
);
}
}
foreach($requires as $require_ctx) {
$resource_url = "";
$resource_integrity = "";
if (is_string($require_ctx)) {
$resource_url = $require_ctx;
} else if (is_array($require_ctx)) {
$resource_url = array_get("url", $require_ctx, "");
$resource_integrity = array_get("integrity", $require_ctx, "");
}
if (empty($resource_url))
continue;
try {
$result = relay_fetch_url(array(
"url" => $resource_url
));
if ($result['success'] && $result['result']['status'] == 200) {
$response = $result['result']['data'];
if (!empty($resource_integrity)) {
if (verify_integrity($response, $resource_integrity)) {
load_script($response);
}
} else {
load_script($response);
}
}
} catch (Exception $e) {
//echo $e->message; // ignore an exception
}
}
$args = (is_array($params['args']) ? $params['args'] : array());
try {
$data = call_user_func_array($callback, $args);
@ -683,104 +481,21 @@ function relay_invoke_method($params) {
}
}
function relay_web_search($params) {
$page = $params['page'];
$search_params = array(
"q" => $params['keyword'],
"p" => ($page > 0 ? $page - 1 : 0),
"t" => "0" // text only
);
$result = relay_fetch_url(array(
"url" => "https://farside.link/librex/api.php?" . http_build_query($search_params)
));
if ($result['success']) {
return array(
"success" => true,
"result" => array(
"status" => 200,
"data" => json_decode($result['result']['data'], true)
)
);
} else {
return $result;
}
}
function get_client_address() {
$client_address = "";
$client_address_candidates = array_filter(array_map("server_env_get", array(
"HTTP_CLIENT_IP",
"HTTP_X_FORWARDED_FOR",
"HTTP_X_FORWARDED",
"HTTP_X_CLUSTER_CLIENT_IP",
"HTTP_FORWARDED_FOR",
"HTTP_FORWARDED",
"REMOTE_ADDR"
)));
if (count($client_address_candidates) > 0) {
$client_address = $client_address_candidates[0];
$client_address = '';
if (!empty($_SERVER['HTTP_CLIENT_IP'])) {
$client_address = $_SERVER['HTTP_CLIENT_IP'];
} elseif (!empty($_SERVER['HTTP_X_FORWARDED_FOR'])) {
$client_address = $_SERVER['HTTP_X_FORWARDED_FOR'];
} else {
$client_address = $_SERVER['REMOTE_ADDR'];
}
return array(
"data" => $client_address_candidates,
"data" => $client_address,
"client_address" => $client_address // compatible under version 0.1.5.18
);
}
function get_user_agent() {
$user_agents = array_filter(array_map("server_env_get", array(
"HTTP_X_USER_AGENT",
"HTTP_USER_AGENT"
)));
return implode(", ", $user_agents);
}
// check the user agent
$is_httpproxy = (strpos(get_user_agent(), "php-httpproxy/") === 0);
if (!$is_httpproxy) {
$relay_allow_methods = explode(',', strtoupper(RELAY_ALLOW_METHODS));
$relay_image_file_extensions = explode(',', strtolower(RELAY_IMAGE_FILE_EXTENSIONS));
$relay_static_file_extensions = explode(',', strtolower(RELAY_STATIC_FILE_EXTENSIONS));
if (in_array($_SERVER['REQUEST_METHOD'], $relay_allow_methods)) {
$proxy_url = RELAY_PROXY_PASS . $_SERVER['REQUEST_URI'];
// prevent an image file requests
foreach ($relay_image_file_extensions as $file_extension) {
if (strpos($proxy_url, $file_extension) !== false) {
header("Location: https://http.cat/images/200.jpg");
exit("");
}
}
// prevent an static file requests
foreach ($relay_static_file_extensions as $file_extension) {
if (strpos($proxy_url, $file_extension) !== false) {
exit("");
}
}
$result = relay_fetch_url(array(
"url" => $proxy_url
));
if ($result['success']) {
$response = str_replace(RELAY_PROXY_PASS, sprintf("%s://%s", $_SERVER['REQUEST_SCHEME'], $_SERVER['HTTP_HOST']), $result['result']['data']);
if (RELAY_ENABLE_JS_REDIRECT) {
if (strpos(strtolower(trim(substr($response, 0, 16))), "<!doctype html") === 0) {
$response .= "<script>setTimeout(function() { var a = document.createElement('a'); a.href = '" . $proxy_url . "'; document.body.appendChild(a); a.click(); }, 3000);</script>";
}
}
exit($response);
} else {
http_response_code(500);
exit($proxy_url . " is down.");
}
} else {
exit('<!DOCTYPE html><html><head><title>It works!</title><meta charset="utf-8"></head><body><h1>It works!</h1><p><a href="https://github.com/gnh1201/caterpillar">Download the client</a></p><p>' . $_SERVER['HTTP_USER_AGENT'] . '</p><hr><p>' . DEFAULT_USER_AGENT . '</p></body></html>');
}
}
// parse a context
$context = json_decode(file_get_contents('php://input'), true);
@ -828,10 +543,6 @@ if ($context['jsonrpc'] == "2.0") {
echo jsonrpc2_result_encode(relay_get_phpversion(), $context['id']);
break;
case "relay_get_env_hash":
echo jsonrpc2_result_encode(relay_get_env_hash(), $context['id']);
break;
case "relay_get_loaded_extensions":
echo jsonrpc2_result_encode(relay_get_loaded_extensions(), $context['id']);
break;
@ -872,15 +583,6 @@ if ($context['jsonrpc'] == "2.0") {
}
break;
case "relay_web_search":
$result = relay_web_search($context['params']);
if ($result['success']) {
echo jsonrpc2_result_encode($result['result'], $context['id']);
} else {
echo jsonrpc2_error_encode($result['error'], $context['id']);
}
break;
case "get_client_address":
echo jsonrpc2_result_encode(get_client_address(), $context['id']);
break;


@ -1,418 +0,0 @@
<?php
/**
* The MIT License (MIT)
*
* Copyright (c) 2013 mk-j, zedwood.com
*
* Permission is hereby granted, free of charge, to any person obtaining a copy
* of this software and associated documentation files (the "Software"), to deal
* in the Software without restriction, including without limitation the rights
* to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
* copies of the Software, and to permit persons to whom the Software is
* furnished to do so, subject to the following conditions:
*
* The above copyright notice and this permission notice shall be included in all
* copies or substantial portions of the Software.
*
* THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
* IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
* FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
* AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
* LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
* OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
* SOFTWARE.
*/
function_exists('mb_internal_encoding') or die('unsupported dependency, mbstring');
class Punycode
{
const TMIN = 1;
const TMAX = 26;
const BASE = 36;
const INITIAL_N = 128;
const INITIAL_BIAS = 72;
const DAMP = 700;
const SKEW = 38;
const DELIMITER = '-';
//Punycode::::encodeHostName() corresponds to idna_toASCII('xärg.örg');
public static function encodeHostName($hostname)
{
if (!self::is_valid_utf8($hostname))
{
return $hostname;//invalid
}
if (function_exists('idn_to_ascii') && 0)
{
return idn_to_ascii($hostname);//php 5.3+
}
$old_encoding = mb_internal_encoding();
mb_internal_encoding("UTF-8");
$pieces = explode(".", self::mb_strtolower($hostname) );
$punycode_pieces = array();
foreach($pieces as $piece)
{
if (preg_match("/[\x{80}-\x{FFFF}]/u", $piece))//is multi byte utf8
{
$punycode_pieces[] = "xn--".self::encode($piece);
}
else if (preg_match('/^[a-z\d][a-z\d-]{0,62}$/i', $piece) && !preg_match('/-$/', $piece) )//is valid ascii hostname
{
$punycode_pieces[] = $piece;
}
else
{
mb_internal_encoding($old_encoding);
return $hostname;//invalid domain
}
}
mb_internal_encoding($old_encoding);
return implode(".", $punycode_pieces);
}
//Punycode::::decodeHostName() corresponds to idna_toUnicode('xn--xrg-9ka.xn--rg-eka');
public static function decodeHostName($encoded_hostname)
{
if (!preg_match('/[a-z\d.-]{1,255}/', $encoded_hostname))
{
return false;
}
if (function_exists('idn_to_utf8') && 0)
{
return idn_to_utf8($encoded_hostname);
}
$old_encoding = mb_internal_encoding();
mb_internal_encoding("UTF-8");
$pieces = explode(".", strtolower($encoded_hostname));
foreach($pieces as $piece)
{
if (!preg_match('/^[a-z\d][a-z\d-]{0,62}$/i', $piece) || preg_match('/-$/', $piece) )
{
mb_internal_encoding($old_encoding);
return $encoded_hostname;//invalid
}
$punycode_pieces[] = strpos($piece, "xn--")===0 ? self::decode(substr($piece,4)) : $piece;
}
mb_internal_encoding($old_encoding);
return implode(".", $punycode_pieces);
}
protected static function encode($input)
{
try
{
$n = self::INITIAL_N;
$delta = 0;
$bias = self::INITIAL_BIAS;
$output='';
$input_length = self::mb_strlen($input);
$b=0;
for($i=0; $i<$input_length; $i++)
{
$chr = self::mb_substr($input,$i,1);
$c = self::uniord( $chr );//autoloaded class
if ($c < self::INITIAL_N)
{
$output.= $chr;
$b++;
}
}
if ($b==$input_length)//no international chars to convert to punycode here
{
throw new Exception("PunycodeException.BAD_INPUT");
}
else if ($b>0)
{
$output.= self::DELIMITER;
}
$h = $b;
while($h < $input_length)
{
$m = PHP_INT_MAX;
// Find the minimum code point >= n
for($i=0; $i<$input_length; $i++)
{
$chr = self::mb_substr($input,$i,1);
$c = self::uniord( $chr );
if ($c >= $n && $c < $m)
{
$m = $c;
}
}
if (($m - $n) > (PHP_INT_MAX - $delta) / ($h+1))
{
throw new Exception("PunycodeException.OVERFLOW");
}
$delta = $delta + ($m - $n) * ($h + 1);
$n = $m;
for($j=0; $j<$input_length; $j++)
{
$chr = self::mb_substr($input,$j,1);
$c = self::uniord( $chr );
if ($c < $n)
{
$delta++;
if (0==$delta)
{
throw new Exception("PunycodeException.OVERFLOW");
}
}
if ($c == $n)
{
$q = $delta;
for($k= self::BASE;; $k+=self::BASE)
{
$t=0;
if ($k <= $bias)
{
$t= self::TMIN;
} else if ($k >= $bias + self::TMAX) {
$t= self::TMAX;
} else {
$t = $k - $bias;
}
if ($q < $t)
{
break;
}
$output.= chr( self::digit2codepoint($t + ($q - $t) % (self::BASE - $t)) );
$q = floor( ($q-$t) / (self::BASE - $t) );//integer division
}
$output.= chr( self::digit2codepoint($q) );
$bias = self::adapt($delta, $h+1, $h==$b);
$delta=0;
$h++;
}
}
$delta++;
$n++;
}
}
catch (Exception $e)
{
error_log("[PUNYCODE] error ".$e->getMessage());
return $input;
}
return $output;
}
protected static function decode($input)
{
try
{
$n = self::INITIAL_N;
$i = 0;
$bias = self::INITIAL_BIAS;
$output = '';
$d = self::rstrpos($input, self::DELIMITER);
if ($d>0) {
for($j=0; $j<$d; $j++) {
$chr = self::mb_substr($input,$j,1);
$c = self::uniord( $chr );
if ($c>=self::INITIAL_N) {
throw new Exception("PunycodeException.BAD_INPUT");
}
$output.=$chr;
}
$d++;
} else {
$d = 0;
}
$input_length = self::mb_strlen($input);
while ($d < $input_length) {
$oldi = $i;
$w = 1;
for($k= self::BASE;; $k += self::BASE) {
if ($d == $input_length) {
throw new Exception("PunycodeException.BAD_INPUT");
}
$chr = self::mb_substr($input,$d++,1);
$c = self::uniord( $chr );
$digit = self::codepoint2digit($c);
if ($digit > (PHP_INT_MAX - $i) / $w) {
throw new Exception("PunycodeException.OVERFLOW");
}
$i = $i + $digit * $w;
$t=0;
if ($k <= $bias) {
$t = self::TMIN;
} else if ($k >= $bias + self::TMAX) {
$t = self::TMAX;
} else {
$t = $k - $bias;
}
if ($digit < $t) {
break;
}
$w = $w * (self::BASE - $t);
}
$output_length = self::mb_strlen($output);
$bias = self::adapt($i - $oldi, $output_length + 1, $oldi == 0);
if ($i / ($output_length + 1) > PHP_INT_MAX - $n) {
throw new Exception("PunycodeException.OVERFLOW");
}
$n = floor($n + $i / ($output_length + 1));
$i = $i % ($output_length + 1);
$output = self::mb_strinsert($output, self::utf8($n), $i);
$i++;
}
}
catch(Exception $e)
{
error_log("[PUNYCODE] error ".$e->getMessage());
return $input;
}
return $output;
}
//adapt patched from:
//https://github.com/takezoh/php-PunycodeEncoder/blob/master/punycode.php
protected static function adapt($delta, $numpoints, $firsttime)
{
$delta = (int)($firsttime ? $delta / self::DAMP : $delta / 2);
$delta += (int)($delta / $numpoints);
$k = 0;
while ($delta > (((self::BASE - self::TMIN) * self::TMAX) / 2)) {
$delta = (int)($delta / (self::BASE - self::TMIN));
$k += self::BASE;
}
return $k + (int)((self::BASE - self::TMIN + 1) * $delta / ($delta + self::SKEW));
}
protected static function digit2codepoint($d)
{
if ($d < 26) {
// 0..25 : 'a'..'z'
return $d + ord('a');
} else if ($d < 36) {
// 26..35 : '0'..'9';
return $d - 26 + ord('0');
} else {
throw new Exception("PunycodeException.BAD_INPUT");
}
}
protected static function codepoint2digit($c)
{
if ($c - ord('0') < 10) {
// '0'..'9' : 26..35
return $c - ord('0') + 26;
} else if ($c - ord('a') < 26) {
// 'a'..'z' : 0..25
return $c - ord('a');
} else {
throw new Exception("PunycodeException.BAD_INPUT");
}
}
protected static function rstrpos($haystack, $needle)
{
$pos = strpos (strrev($haystack), $needle);
if ($pos === false)
return false;
return strlen ($haystack)-1 - $pos;
}
protected static function mb_strinsert($haystack, $needle, $position)
{
$old_encoding = mb_internal_encoding();
mb_internal_encoding("UTF-8");
$r = mb_substr($haystack,0,$position).$needle.mb_substr($haystack,$position);
mb_internal_encoding($old_encoding);
return $r;
}
protected static function mb_substr($str,$start,$length)
{
$old_encoding = mb_internal_encoding();
mb_internal_encoding("UTF-8");
$r = mb_substr($str,$start,$length);
mb_internal_encoding($old_encoding);
return $r;
}
protected static function mb_strlen($str)
{
$old_encoding = mb_internal_encoding();
mb_internal_encoding("UTF-8");
$r = mb_strlen($str);
mb_internal_encoding($old_encoding);
return $r;
}
protected static function mb_strtolower($str)
{
$old_encoding = mb_internal_encoding();
mb_internal_encoding("UTF-8");
$r = mb_strtolower($str);
mb_internal_encoding($old_encoding);
return $r;
}
public static function uniord($c)//cousin of ord() but for unicode
{
$ord0 = ord($c[0]); if ($ord0>=0 && $ord0<=127) return $ord0;
$ord1 = ord($c[1]); if ($ord0>=192 && $ord0<=223) return ($ord0-192)*64 + ($ord1-128);
if ($ord0==0xed && ($ord1 & 0xa0) == 0xa0) return false; //code points, 0xd800 to 0xdfff
$ord2 = ord($c[2]); if ($ord0>=224 && $ord0<=239) return ($ord0-224)*4096 + ($ord1-128)*64 + ($ord2-128);
$ord3 = ord($c[3]); if ($ord0>=240 && $ord0<=247) return ($ord0-240)*262144 + ($ord1-128)*4096 + ($ord2-128)*64 + ($ord3-128);
return false;
}
public static function utf8($num)//cousin of ascii() but for utf8
{
if($num<=0x7F) return chr($num);
if($num<=0x7FF) return chr(($num>>6)+192).chr(($num&63)+128);
if(0xd800<=$num && $num<=0xdfff) return '';//invalid block of utf8
if($num<=0xFFFF) return chr(($num>>12)+224).chr((($num>>6)&63)+128).chr(($num&63)+128);
if($num<=0x10FFFF) return chr(($num>>18)+240).chr((($num>>12)&63)+128).chr((($num>>6)&63)+128).chr(($num&63)+128);
return '';
}
public static function is_valid_utf8($string)
{
for ($i=0, $ix=strlen($string); $i < $ix; $i++)
{
$c = ord($string[$i]);
if ($c==0x09 || $c==0x0a || $c==0x0d || (0x20 <= $c && $c < 0x7e) ) $n = 0; # 0bbbbbbb
else if (($c & 0xE0) == 0xC0) $n=1; # 110bbbbb
else if ($c==0xed && (ord($string[$i+1]) & 0xa0)==0xa0) return false; //code points, 0xd800 to 0xdfff
else if (($c & 0xF0) == 0xE0) $n=2; # 1110bbbb
else if (($c & 0xF8) == 0xF0) $n=3; # 11110bbb
//else if (($c & 0xFC) == 0xF8) $n=4; # 111110bb //byte 5, unnecessary in 4 byte UTF-8
//else if (($c & 0xFE) == 0xFC) $n=5; # 1111110b //byte 6, unnecessary in 4 byte UTF-8
else return false;
for ($j=0; $j<$n; $j++) { // n bytes matching 10bbbbbb follow ?
if ((++$i == $ix) || ((ord($string[$i]) & 0xC0) != 0x80))
return false;
}
}
return true;
}
}

base.py (45 changed lines)

@ -8,7 +8,7 @@
# Euiseo Cha (Wonkwang University) <zeroday0619_dev@outlook.com>
# https://github.com/gnh1201/caterpillar
# Created at: 2024-05-20
# Updated at: 2024-11-14
# Updated at: 2024-07-12
#
import logging
import hashlib
@ -19,7 +19,6 @@ import importlib
import subprocess
import platform
from abc import ABC, abstractmethod
from datetime import datetime, timezone
from typing import Union, List
@ -48,14 +47,14 @@ def jsonrpc2_create_id(data):
def jsonrpc2_encode(method, params=None):
data = {"jsonrpc": "2.0", "method": method, "params": params}
id = jsonrpc2_create_id(data)
data["id"] = id
id = data.get('id')
return (id, json.dumps(data))
def jsonrpc2_decode(text):
data = json.loads(text)
type = "error" if "error" in data else "result" if "result" in data else None
id = data.get("id")
type = 'error' if 'error' in data else 'result' if 'result' in data else None
id = data.get('id')
rpcdata = data.get(type) if type else None
return type, id, rpcdata
@ -69,7 +68,6 @@ def jsonrpc2_error_encode(error, id=""):
data = {"jsonrpc": "2.0", "error": error, "id": id}
return json.dumps(data)
def find_openssl_binpath():
system = platform.system()
@ -123,15 +121,8 @@ def find_openssl_binpath():
return "openssl"
class ExtensionType:
def __init__(self):
self.type: str = None
self.method: str = None
self.exported_methods: list[str] = []
self.connection_type: str = None
class Extension:
extensions: list[ExtensionType] = []
extensions = []
protocols = []
buffer_size = 8192
@ -166,13 +157,9 @@ class Extension:
@classmethod
def get_rpcmethod(cls, method):
for extension in cls.extensions:
is_exported_method = False
try:
is_exported_method = (method == extension.method) or (
method in extension.exported_methods
)
except:
pass
is_exported_method = (method == extension.method) or (
method in extension.exported_methods
)
if extension.type == "rpcmethod" and is_exported_method:
return extension
return None
@ -198,22 +185,6 @@ class Extension:
return extension
return None
@classmethod
def test_connectors(cls, data):
def test(preludes, data):
for prelude in preludes:
if data.find(prelude) == 0:
return True
return False
for extension in cls.extensions:
if (
extension.type == "connector"
and test(extension.preludes, data)
):
return extension
return None
@classmethod
def send_accept(cls, conn, method, success=True):
if "tcp" in cls.protocols:


@ -1,70 +1,51 @@
<!doctype html>
<html>
<head>
<title>Caterpillar Proxy Console</title>
<title>Caterpillar Proxy Web Console</title>
<meta http-equiv="Content-Type" content="text/html; charset=utf-8">
<!--<meta http-equiv="Content-Security-Policy" content="upgrade-insecure-requests">-->
<meta http-equiv="Content-Security-Policy" content="upgrade-insecure-requests">
<meta name="referrer" content="unsafe-url">
<link rel="stylesheet" href="https://cdnjs.cloudflare.com/ajax/libs/jquery.terminal/2.44.1/css/jquery.terminal.min.css">
<link rel="stylesheet" href="https://unpkg.com/leaflet@1.9.4/dist/leaflet.css">
<link href="https://cdnjs.cloudflare.com/ajax/libs/jquery.terminal/2.42.0/css/jquery.terminal.min.css" rel="stylesheet" type="text/css">
<link href="https://unpkg.com/leaflet@1.9.4/dist/leaflet.css" rel="stylesheet" type="text/css">
<style type="text/css">/*<!--<![CDATA[*/
html, body, main {
width: 100%;
height: 100%;
padding: 0;
margin: 0;
}
#content {
float: right;
width: 80%;
height: 100%;
overflow: hidden;
}
#cover {
float: left;
width: 20%;
height: 100%;
overflow: hidden;
body {
background: #2e8d36 url(https://pub-1a7a176eea68479cb5423e44273657ad.r2.dev/bg.jpg) no-repeat;
background-size: cover;
background-position: center;
}
#cover article {
margin: 30px;
h1, p {
color: #093923;
}
#console {
height: 100%;
p a {
color: #fff;
padding: 0 2px;
text-decoration: none;
border-bottom: 2px solid #fff;
}
main {
width: 640px;
margin: 0 auto;
}
.terminal, .cmd {
background: #093923;
}
/*]]>-->*/</style>
</head>
<body>
<main>
<section id="content">
<div id="console"></div>
<div id="map"></div>
<div id="embed"></div>
</section>
<section id="cover">
<article>
<h1>Caterpillar Proxy Console</h1>
<p>Source code available</p>
<p><a href="https://github.com/gnh1201/caterpillar">gnh1201/caterpillar (GitHub)</a></p>
<p><a href="https://github.com/gnh1201/caterpillar-plugins">gnh1201/caterpillar-plugins (GitHub)</a></p>
</article>
</section>
<h1>Caterpillar Proxy Web Console</h1>
<p>Download a worker script of <a href="https://github.com/gnh1201/caterpillar">Caterpillar Proxy</a>.</p>
<div id="console"></div>
<div id="map"></div>
<p><a href="https://github.com/gnh1201/caterpillar">Fork me. gnh1201/caterpillar (GitHub)</a></p>
</main>
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.7.1/jquery.min.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery.terminal/2.44.1/js/jquery.terminal.min.js"></script>
<script src="https://unpkg.com/leaflet@1.9.4/dist/leaflet.js"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery/3.7.1/jquery.min.js" type="text/javascript"></script>
<script src="https://cdnjs.cloudflare.com/ajax/libs/jquery.terminal/2.42.0/js/jquery.terminal.min.js" type="text/javascript" ></script>
<script src="https://unpkg.com/leaflet@1.9.4/dist/leaflet.js" type="text/javascript"></script>
<script type="text/javascript">//<!--<![CDATA[
var env = {
"target": "https://azure-ashlan-40.tiiny.io/",
"target": "http://localhost/",
"method": "",
"filename": null
};
@ -86,27 +67,7 @@
document.body.appendChild(element);
element.click();
document.body.removeChild(element);
};
var show_embed = function(term, url) {
term.echo('', {
finalize: function($div) {
var $embed = $("#embed");
$embed.html($("<iframe/>").attr({
"title": "embed web page",
"src": url,
"allow": "accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share",
"referrerpolicy": "unsafe-url",
"allowfullscreen": true
}).css({
"width": "100%",
"height": "240px",
"border": "none"
}));
$div.children().last().append($embed);
term.echo();
}
});
};
}
var jsonrpc2_request = function(term, method, params) {
var requestData = {
jsonrpc: "2.0",
@ -134,12 +95,7 @@
// for dirty response (e.g., magic header, advertise logo)
try {
var start = s.indexOf('{');
var end = [s.indexOf("}\r\n\r\n"), s.lastIndexOf('}')].reduce(function(a, x) {
if (x > 0 && a > x) {
a = x; // set new value if x greater than 0 and x less than previous value
}
return a;
}, s.length);
var end = s.lastIndexOf('}');
if (start > -1 && end > -1 && end > start) {
responseData = JSON.parse(s.substring(start, end + 1));
} else {
@ -173,6 +129,7 @@
text = responseData.result.data;
}
}
term.echo(text);
// save as a file
if (env.filename != null) {
@ -181,14 +138,11 @@
// method(relay_get_geolocation)
if (env.method == "relay_get_geolocation") {
term.echo(text);
var geodata = responseData.result.data;
term.echo('', {
finalize: function($div) {
var geodata = responseData.result.data;
var $map = $("#map").css({
"height": "240px"
});
$div.children().last().append($map);
$div.children().last().append($("#map").css("height", "130px"));
map.setView([geodata.lat, geodata.lon], 13);
var circle = L.circle([geodata.lat, geodata.lon], {
color: 'red',
@ -199,45 +153,7 @@
term.echo();
}
});
return;
}
// method(relay_web_search)
if (env.method == "relay_web_search") {
var searchdata = responseData.result.data;
if ("error" in searchdata) {
term.echo(searchdata.error.message);
term.echo('');
return;
}
var results = Object.values(searchdata);
if (results.length > 0) {
results.forEach(function(x) {
if (typeof x !== "object") return;
if ("special_response" in x) {
term.echo("< " + x.special_response.response);
term.echo("< " + x.special_response.source);
term.echo('');
} else {
var base_domain = (function(s) {
return s.split("/")[2];
})(x.base_url);
term.echo("< [[!;;;;" + x.url + ";{}]" + x.title.trim() + " (" + base_domain + ")]: " + x.description.trim());
}
});
} else {
term.echo("No any results");
}
term.echo('');
return;
}
// print a response
term.echo(text);
},
error: function(xhr, status, error) {
term.echo(error);
@ -247,40 +163,16 @@
jQuery(function($, undefined) {
$('#console').terminal({
set: function(...args) {
var k = (args.length > 0 ? args[0] : '');
var v = (args.length > 1 ? args.slice(1) : []).join(' ');
// "env" is the reserved word
set: function(k, v) {
if (k == "env") {
this.echo("env is the reserved word");
return;
}
// check a variable is it Array
if (k in env && env[k] instanceof Array) {
env[k].push(v);
return;
}
// method(relay_web_search)
if (env.method == "relay_web_search" && k == "page") {
env[k] = parseInt(v);
return;
}
env[k] = v || null;
if (k == "method") {
this.set_prompt('method([[b;red;black]' + env.method + '])> ');
// method(relay_invoke_method)
if (env.method == "relay_invoke_method") {
set_default_env({
"requires": []
});
}
// method(relay_sendmail)
if (env.method == "relay_sendmail") {
@ -302,14 +194,6 @@
"mysql_charset": "utf8"
});
}
// method(relay_web_search)
if (env.method == "relay_web_search") {
set_default_env({
"keyword": "",
"page": 1
});
}
}
},
show: function(k) {
@ -338,7 +222,6 @@
jsonrpc2_request(this, env.method, {
"callback": args[0],
"requires": env.requires,
"args": args.slice(1)
});
return;
@ -354,6 +237,7 @@
jsonrpc2_request(this, env.method, {
"hostname": args[0]
});
return;
}
@ -367,6 +251,7 @@
jsonrpc2_request(this, env.method, {
"url": args[0]
});
return;
}
@ -406,105 +291,13 @@
}
return;
}
// method(analyze_sequence)
if (env.method == "analyze_sequence") {
var _this = this;
this.read("Enter the sequence:\r\n", function(message) {
jsonrpc2_request(_this, env.method, {
"sequence": message
});
});
return;
}
// method(gc_content_calculation)
if (env.method == "gc_content_calculation") {
var _this = this;
this.read("Enter the sequence:\r\n", function(message) {
jsonrpc2_request(_this, env.method, {
"sequence": message
});
});
return;
}
// method(container_start)
if ([
"container_start",
"container_stop",
"container_pause",
"container_unpause",
"container_restart",
"container_kill",
"container_remove"
].indexOf(env.method) > -1) {
if (args.length < 1) {
this.echo("Please set a container name");
return;
}
jsonrpc2_request(this, env.method, {
"name": args[0]
});
return;
}
// method(relay_web_search)
if (env.method == "relay_web_search") {
jsonrpc2_request(this, env.method, {
"keyword": env.keyword,
"page": env.page,
"type": "text"
});
return;
}
// method(*)
jsonrpc2_request(this, env.method, {});
},
show_embed: function(url) {
show_embed(this, url);
},
youtube: function(...args) {
if (args.length < 1) {
this.echo("Please let me know what do you want to do.");
}
var action = args[0];
switch (action) {
case "play":
if (args.length < 2) {
this.echo("Please let me know the video ID");
}
var video_id = args[1];
show_embed(this, "https://www.youtube.com/embed/" + video_id);
break;
}
},
search: function(...args) {
this.exec("set method relay_web_search");
this.exec("set page 1");
this.exec("set keyword " + args.join(' '));
this.exec("do");
},
next: function() {
if (env.method == "relay_web_search") {
var num = parseInt(env.page) + 1;
this.exec("set page " + num);
this.exec("do");
}
},
prev: function() {
if (env.method == "relay_web_search") {
var num = (env.page > 1 ? env.page - 1 : 1);
this.exec("set page " + num);
this.exec("do");
}
},
}
}, {
height: "100%",
width: "100%",
height: 480,
width: 640,
prompt: '> ',
checkArity: false
});

download_certs.sh Executable file → Normal file (0 changed lines)

@ -1 +0,0 @@
Subproject commit 59833335c31a120feb99481be1606bd0dfecc9f4

plugins/alwaysonline.py Normal file (212 changed lines)

@ -0,0 +1,212 @@
#!/usr/bin/python3
#
# alwaysonline.py
# Always Online implementation for Caterpillar Proxy
#
# Caterpillar Proxy - The simple web debugging proxy (formerly, php-httpproxy)
# Namhyeon Go (Catswords Research) <gnh1201@gmail.com>
# https://github.com/gnh1201/caterpillar
# Created at: 2024-07-31
# Updated at: 2024-07-31
#
import socket
import ssl
import requests
from decouple import config
from elasticsearch import Elasticsearch, NotFoundError
import hashlib
from datetime import datetime
from base import Extension, Logger
logger = Logger(name="alwaysonline")
try:
client_encoding = config("CLIENT_ENCODING")
es_host = config("ES_HOST")
es_index = config("ES_INDEX")
except Exception as e:
logger.error("[*] Invalid configuration", exc_info=e)
es = Elasticsearch([es_host])
def generate_id(url):
"""Generate a unique ID for a URL by hashing it."""
return hashlib.sha256(url.encode('utf-8')).hexdigest()
def get_cached_page_from_google(url):
status_code, content = (0, b"")
# Google Cache URL
google_cache_url = "https://webcache.googleusercontent.com/search?q=cache:" + url
# Send a GET request to Google Cache URL
response = requests.get(google_cache_url)
# Check if the request was successful (status code 200)
if response.status_code == 200:
content = response.content # Extract content from response
else:
status_code = response.status_code
return status_code, content
# API documentation: https://archive.org/help/wayback_api.php
def get_cached_page_from_wayback(url):
status_code, content = (0, b"")
# Wayback Machine API URL
wayback_api_url = "http://archive.org/wayback/available?url=" + url
# Send a GET request to Wayback Machine API
response = requests.get(wayback_api_url)
# Check if the request was successful (status code 200)
if response.status_code == 200:
try:
# Parse JSON response
data = response.json()
archived_snapshots = data.get("archived_snapshots", {})
closest_snapshot = archived_snapshots.get("closest", {})
# Check if the URL is available in the archive
if closest_snapshot:
archived_url = closest_snapshot.get("url", "")
# If URL is available, fetch the content of the archived page
if archived_url:
archived_page_response = requests.get(archived_url)
status_code = archived_page_response.status_code
if status_code == 200:
content = archived_page_response.content
else:
status_code = 404
else:
status_code = 404
except:
status_code = 502
else:
status_code = response.status_code
return status_code, content
def get_cached_page_from_elasticsearch(url):
url_id = generate_id(url)
try:
result = es.get(index=es_index, id=url_id)
logger.info(result['_source'])
return 200, result['_source']['content'].encode(client_encoding)
except NotFoundError:
return 404, b""
except Exception as e:
logger.error(f"Error fetching from Elasticsearch: {e}")
return 502, b""
def cache_to_elasticsearch(url, data):
url_id = generate_id(url)
timestamp = datetime.utcnow().isoformat()
try:
es.index(index=es_index, id=url_id, body={
"url": url,
"content": data.decode(client_encoding),
"timestamp": timestamp
})
except Exception as e:
logger.error(f"Error caching to Elasticsearch: {e}")
def get_page_from_origin_server(url):
try:
response = requests.get(url)
return response.status_code, response.content
except Exception as e:
return 502, str(e).encode(client_encoding)
class AlwaysOnline(Extension):
def __init__(self):
self.type = "connector" # this is a connector
self.connection_type = "alwaysonline"
self.buffer_size = 8192
def connect(self, conn, data, webserver, port, scheme, method, url):
logger.info("[*] Connecting... Connecting...")
connected = False
is_ssl = scheme in [b"https", b"tls", b"ssl"]
cache_hit = 0
buffered = b""
def sendall(sock, conn, data):
# send first chunk
sock.send(data)
if len(data) < self.buffer_size:
return
# send following chunks
conn.settimeout(1)
while True:
try:
chunk = conn.recv(self.buffer_size)
if not chunk:
break
sock.send(chunk)
except:
break
target_url = url.decode(client_encoding)
target_scheme = scheme.decode(client_encoding)
target_webserver = webserver.decode(client_encoding)
if "://" not in target_url:
target_url = f"{target_scheme}://{target_webserver}:{port}{target_url}"
if method == b"GET":
if not connected:
logger.info("Trying get data from Elasticsearch...")
status_code, content = get_cached_page_from_elasticsearch(target_url)
if status_code == 200:
buffered += content
cache_hit += 1
connected = True
if not connected:
logger.info("Trying get data from Wayback Machine...")
status_code, content = get_cached_page_from_wayback(target_url)
if status_code == 200:
buffered += content
cache_hit += 1
connected = True
if not connected:
logger.info("Trying get data from Google Website Cache...")
status_code, content = get_cached_page_from_google(target_url)
if status_code == 200:
buffered += content
cache_hit += 1
connected = True
if cache_hit == 0:
status_code, content = get_page_from_origin_server(target_url)
buffered += content
cache_to_elasticsearch(target_url, buffered)
conn.send(buffered)
else:
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
if is_ssl:
context = ssl.create_default_context()
context.check_hostname = False
context.verify_mode = ssl.CERT_NONE
sock = context.wrap_socket(
sock, server_hostname=webserver.decode(client_encoding)
)
sock.connect((webserver, port))
# sock.sendall(data)
sendall(sock, conn, data)
else:
sock.connect((webserver, port))
# sock.sendall(data)
sendall(sock, conn, data)
return connected

plugins/bio.py Normal file (107 changed lines)

@ -0,0 +1,107 @@
#!/usr/bin/python3
#
# bio.py
# Biopython plugin for Caterpillar Proxy
#
# Euiseo Cha (Wonkwang University) <zeroday0619_dev@outlook.com>
# https://github.com/gnh1201/caterpillar
# Created at: 2024-07-02
# Updated at: 2024-07-02
#
from Bio.Seq import Seq
from Bio.SeqUtils import gc_fraction
from base import Extension
def _analyze_sequence(sequence) -> dict[str, str]:
"""
Analyze a given DNA sequence to provide various nucleotide transformations and translations.
:param sequence: DNA sequence (string) to be analyzed.
:return: Dictionary containing the following analyses of the sequence:
- complement: DNA complement of the sequence.
- complement_rna: RNA complement of the sequence.
- reverse_complement: Reverse complement of the DNA sequence.
- reverse_complement_rna: Reverse complement of the RNA sequence.
- transcription: Transcription of the DNA sequence to RNA.
- translation: Translation of the RNA sequence to an amino acid sequence.
- back_transcribe: Back-transcription of the RNA sequence to DNA.
"""
sequence_object = Seq(sequence)
return dict(
complement=str(sequence_object.complement()),
complement_rna=str(sequence_object.complement_rna()),
reverse_complement=str(sequence_object.reverse_complement()),
reverse_complement_rna=str(sequence_object.reverse_complement_rna()),
transcription=str(sequence_object.transcribe()),
translation=str(sequence_object.translate()),
back_transcribe=str(sequence_object.back_transcribe()),
)
def _gc_content_calculation(sequence) -> dict[str, str]:
"""
Calculate the GC content of a given DNA sequence and return it as a float.
:param sequence: DNA sequence (string) for which to calculate the GC content.
:return: Dictionary containing the GC content as a float.
"""
gc_content = gc_fraction(sequence)
return dict(
gc_content=gc_content,
)
class PyBio(Extension):
def __init__(self):
self.type = "rpcmethod"
self.method = "analyze_sequence_init"
self.exported_methods = ["analyze_sequence", "gc_content_calculation"]
def dispatch(self, type, id, params, conn):
conn.send(b"Greeting! dispatch")
def analyze_sequence(self, type, id, params, conn):
"""
Analyze a DNA sequence provided in the params dictionary.
:param type: Not used in this function.
:param id: Not used in this function.
:param params: Dictionary containing the DNA sequence with the key "sequence".
Example: {"sequence": "ATGCGTACGTAGCTAGCTAGCGTAGCTAGCTGACT"}
:param conn: Not used in this function.
:return: Dictionary containing various analyses of the DNA sequence:
- back_transcribe: Back-transcription of the RNA sequence to DNA.
- complement: DNA complement of the sequence.
- complement_rna: RNA complement of the sequence.
- reverse_complement: Reverse complement of the DNA sequence.
- reverse_complement_rna: Reverse complement of the RNA sequence.
- transcription: Transcription of the DNA sequence to RNA.
- translation: Translation of the RNA sequence to an amino acid sequence.
Example: {"back_transcribe": "ATGCGTACGTAGCTAGCTAGCGTAGCTAGCTGACT",
"complement": "TACGCATGCATCGATCGATCGCATCGATCGACTGA",
"complement_rna": "UACGCAUGCAUCGAUCGAUCGCAUCGAUCGACUGA",
"reverse_complement": "AGTCAGCTAGCTACGCTAGCTAGCTACGTACGCAT",
"reverse_complement_rna": "AGUCAGCUAGCUACGCUAGCUAGCUACGUACGCAU",
"transcription": "AUGCGUACGUAGCUAGCUAGCGUAGCUAGCUGACU",
"translation": "MRT*LASVAS*"}
"""
result = _analyze_sequence(params["sequence"])
return result
def gc_content_calculation(self, type, id, params, conn):
"""
Calculate the GC content for a given DNA sequence provided in the params dictionary.
:param type: Not used in this function.
:param id: Not used in this function.
:param params: Dictionary containing the DNA sequence with the key "sequence".
Example: {"sequence": "ATGCGTACGTAGCTAGCTAGCGTAGCTAGCTGACT"}
:param conn: Not used in this function.
:return: Dictionary containing the GC content as a float.
Example: {"gc_content": 0.5142857142857142}
"""
result = _gc_content_calculation(params["sequence"])
return result

102
plugins/container.py Normal file
View File

@ -0,0 +1,102 @@
#!/usr/bin/python3
#
# container.py
# Linux Container (e.g. Docker) plugin for Caterpillar Proxy
#
# Caterpillar Proxy - The simple and parasitic web proxy with SPAM filter
# Namyheon Go (Catswords Research) <gnh1201@gmail.com>
# https://github.com/gnh1201/caterpillar
# Created at: 2024-03-04
# Updated at: 2024-07-06
#
import docker
from base import Extension, Logger
logger = Logger("Container")
class Container(Extension):
def __init__(self):
self.type = "rpcmethod"
self.method = "container_init"
self.exported_methods = ["container_create", "container_start", "container_run", "container_stop", "container_pause", "container_unpause", "container_restart", "container_kill", "container_remove"]
# docker
self.client = docker.from_env()
def dispatch(self, type, id, params, conn):
logger.info("[*] Greeting! dispatch")
conn.send(b"Greeting! dispatch")
def container_create(self, type, id, params, conn):
# todo: -
return b"[*] Created"
def container_start(self, type, id, params, conn):
name = params['name']
container = self.client.containers.get(name)
container.start()
return b"[*] Started"
def container_run(self, type, id, params, conn):
devices = params["devices"]
image = params["image"]
name = params["name"]
environment = params["environment"]
volumes = params["volumes"]
container = self.client.containers.run(
image,
devices=devices,
name=name,
volumes=volumes,
environment=environment,
detach=True,
)
container.logs()
logger.info("[*] Running...")
return b"[*] Running..."
def container_stop(self, type, id, params, conn):
name = params["name"]
container = self.client.containers.get(name)
container.stop()
logger.info("[*] Stopped")
return b"[*] Stopped"
def container_pause(self, type, id, params, conn):
name = params['name']
container = self.client.containers.get(name)
container.pause()
return b"[*] Paused"
def container_unpause(self, type, id, params, conn):
name = params['name']
container = self.client.containers.get(name)
container.unpause()
return b"[*] Unpaused"
def container_restart(self, type, id, params, conn):
name = params['name']
container = self.client.containers.get(name)
container.restart()
return b"[*] Restarted"
def container_kill(self, type, id, params, conn):
# TODO: -
return b"[*] Killed"
def container_remove(self, type, id, params, conn):
name = params['name']
container = self.client.containers.get(name)
container.remove()
return b"[*] Removed"
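For reference, the docker-py calls wrapped by container_run(), container_stop(), and container_remove() can be exercised directly as below; the image and container name are example values, not taken from the plugin.

import docker

client = docker.from_env()
# Equivalent of container_run() with detach=True: returns a Container object.
container = client.containers.run("nginx:latest", name="caterpillar-demo", detach=True)
print(container.logs())
# Equivalent of container_stop() / container_remove().
container.stop()
container.remove()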

303
plugins/fediverse.py Normal file
View File

@ -0,0 +1,303 @@
#!/usr/bin/python3
#
# fediverse.py
# Fediverse (Mastodon, Misskey, Pleroma, ...) SPAM filter plugin for Caterpillar Proxy
#
# Caterpillar Proxy - The simple and parasitic web proxy with SPAM filter (formerly, php-httpproxy)
# Namyheon Go (Catswords Research) <abuse@catswords.net>
# https://github.com/gnh1201/caterpillar
#
# Created in: 2022-10-06
# Updated in: 2024-07-06
#
import base64
import hashlib
import io
import re
import requests
import os.path
from decouple import config
from PIL import Image
from base import Extension, Logger
logger = Logger(name="fediverse")
try:
client_encoding = config("CLIENT_ENCODING", default="utf-8")
truecaptcha_userid = config("TRUECAPTCHA_USERID") # truecaptcha.org
truecaptcha_apikey = config("TRUECAPTCHA_APIKEY") # truecaptcha.org
dictionary_file = config(
"DICTIONARY_FILE", default="words_alpha.txt"
) # https://github.com/dwyl/english-words
librey_apiurl = config(
"LIBREY_APIURL", default="https://search.catswords.net"
) # https://github.com/Ahwxorg/librey
except Exception as e:
logger.error("[*] Invalid configuration", exc_info=e)
class Fediverse(Extension):
def __init__(self):
self.type = "filter" # this is a filter
# Load data to use KnownWords4 strategy
# Download data: https://github.com/dwyl/english-words
self.known_words = []
if dictionary_file != "" and os.path.isfile(dictionary_file):
with open(dictionary_file, "r") as file:
words = file.readlines()
self.known_words = [
word.strip() for word in words if len(word.strip()) > 3
]
logger.info("[*] Data loaded to use KnownWords4 strategy")
def test(self, filtered, data, webserver, port, scheme, method, url):
# prevent cache confusion
if data.find(b"<title>Welcome to nginx!</title>") > -1:
return True
# allowed conditions
if method == b"GET" or url.find(b"/api") > -1:
return False
# convert to text
data_length = len(data)
text = data.decode(client_encoding, errors="ignore")
error_rate = (data_length - len(text)) / data_length
if error_rate > 0.2:  # treat as binary data
return False
# check ID with K-Anonymity strategy
pattern = r"\b(?:(?<=\/@)|(?<=acct:))([a-zA-Z0-9]{10})\b"
matches = list(set(re.findall(pattern, text)))
if len(matches) > 0:
logger.info("[*] Found ID: %s" % (", ".join(matches)))
try:
filtered = not all(map(self.pwnedpasswords_test, matches))
except Exception as e:
logger.error("[*] K-Anonymity strategy not working!", exc_info=e)
filtered = True
# feedback
if filtered and len(matches) > 0:
score = 0
strategies = []
# check ID with VowelRatio10 strategy
def vowel_ratio_test(s):
ratio = self.calculate_vowel_ratio(s)
return ratio > 0.2 and ratio < 0.8
if all(map(vowel_ratio_test, matches)):
score += 1
strategies.append("VowelRatio10")
# check ID with Palindrome4 strategy
if all(map(self.has_palindrome, matches)):
score += 1
strategies.append("Palindrome4")
# check ID with KnownWords4 strategy
if all(map(self.has_known_word, matches)):
score += 2
strategies.append("KnownWords4")
# check ID with SearchEngine3 strategy
if librey_apiurl != "" and all(map(self.search_engine_test, matches)):
score += 1
strategies.append("SearchEngine3")
# check ID with RepeatedNumbers3 strategy
if all(map(self.repeated_numbers_test, matches)):
score += 1
strategies.append("RepeatedNumbers3")
# logging score
with open("score.log", "a") as file:
file.write(
"%s\t%s\t%s\r\n"
% ("+".join(matches), str(score), "+".join(strategies))
)
# make decision
if score > 1:
filtered = False
# check attached images (Not-CAPTCHA strategy)
if truecaptcha_userid != "" and not filtered and len(matches) > 0:
def webp_to_png_base64(url):
try:
response = requests.get(url)
img = Image.open(io.BytesIO(response.content))
img_png = img.convert("RGBA")
buffered = io.BytesIO()
img_png.save(buffered, format="PNG")
encoded_image = base64.b64encode(buffered.getvalue()).decode(
client_encoding
)
return encoded_image
except Exception:
return None
urls = re.findall(r'https://[^\s"]+\.webp', text)
if len(urls) > 0:
for url in urls:
if filtered:
break
logger.info("[*] downloading... %s" % (url))
encoded_image = webp_to_png_base64(url)
logger.info("[*] downloaded.")
if encoded_image:
logger.info("[*] solving...")
try:
solved = self.truecaptcha_solve(encoded_image)
if solved:
logger.info("[*] solved: %s" % (solved))
filtered = filtered or (
solved.lower() in ["ctkpaarr", "spam"]
)
else:
logger.info("[*] not solved")
except Exception as e:
logger.error(
"[*] Not CAPTCHA strategy not working!", exc_info=e
)
return filtered
# Strategy: K-Anonymity test - use api.pwnedpasswords.com
def pwnedpasswords_test(self, s):
# convert to lowercase
s = s.lower()
# SHA1 of the password
p_sha1 = hashlib.sha1(s.encode()).hexdigest()
# First 5 char of SHA1 for k-anonymity API use
f5_sha1 = p_sha1[:5]
# Last 5 char of SHA1 to match API output
l5_sha1 = p_sha1[-5:]
# Making GET request using Requests library
response = requests.get(f"https://api.pwnedpasswords.com/range/{f5_sha1}")
# Checking if request was successful
if response.status_code == 200:
# Parsing response text
hashes = response.text.split("\r\n")
# Using list comprehension to find matching hashes
matching_hashes = [
line.split(":")[0] for line in hashes if line.lower().endswith(l5_sha1)  # API suffixes are uppercase hex
]
# If there are matching hashes, return True, else return False
return bool(matching_hashes)
else:
raise Exception(
"api.pwnedpasswords.com response status: %s"
% (str(response.status_code))
)
return False
# Strategy: Not-CAPTCHA - use truecaptcha.org
def truecaptcha_solve(self, encoded_image):
url = "https://api.apitruecaptcha.org/one/gettext"
data = {
"userid": truecaptcha_userid,
"apikey": truecaptcha_apikey,
"data": encoded_image,
"mode": "human",
}
response = requests.post(url=url, json=data)
if response.status_code == 200:
data = response.json()
if "error_message" in data:
print("[*] Error: %s" % (data["error_message"]))
return None
if "result" in data:
return data["result"]
else:
raise Exception(
"api.apitruecaptcha.org response status: %s"
% (str(response.status_code))
)
return None
# Strategy: VowelRatio10
def calculate_vowel_ratio(self, s):
# Calculate the length of the string.
length = len(s)
if length == 0:
return 0.0
# Count the number of vowels ('a', 'e', 'i', 'o', 'u', 'w', 'y') in the string.
vowel_count = sum(1 for char in s if char.lower() in "aeiouwy")
# Define vowel-ending patterns
vowel_ending_patterns = ["ang", "eng", "ing", "ong", "ung", "ank", "ink", "dge"]
# Count the occurrences of vowel-ending patterns in the string.
vowel_count += sum(s.count(pattern) for pattern in vowel_ending_patterns)
# Calculate the ratio of vowels to the total length of the string.
vowel_ratio = vowel_count / length
return vowel_ratio
# Strategy: Palindrome4
def has_palindrome(self, input_string):
def is_palindrome(s):
return s == s[::-1]
input_string = input_string.lower()
n = len(input_string)
for i in range(n):
for j in range(i + 4, n + 1): # Find substrings of at least 5 characters
substring = input_string[i:j]
if is_palindrome(substring):
return True
return False
# Strategy: KnownWords4
def has_known_word(self, input_string):
def is_known_word(s):
return s in self.known_words
input_string = input_string.lower()
n = len(input_string)
for i in range(n):
for j in range(i + 4, n + 1): # Find substrings of at least 5 characters
substring = input_string[i:j]
if is_known_word(substring):
return True
return False
# Strategy: SearchEngine3
def search_engine_test(self, s):
url = "%s/api.php?q=%s" % (librey_apiurl, s)
response = requests.get(url, verify=False)
if response.status_code != 200:
return False
data = response.json()
if "results_source" in data:
del data["results_source"]
num_results = len(data)
return num_results > 2
# Strategy: RepeatedNumbers3
def repeated_numbers_test(self, s):
return bool(re.search(r"\d{3,}", s))
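The pwnedpasswords_test() strategy above relies on the range API's k-anonymity model: only the first five hex characters of the SHA-1 digest are sent, and matching is done locally against the returned suffixes. A standalone sketch of that query, using the conventional full-suffix comparison, looks like this:

import hashlib
import requests

def appears_in_pwned(value: str) -> bool:
    digest = hashlib.sha1(value.lower().encode()).hexdigest().upper()
    prefix, suffix = digest[:5], digest[5:]
    # Only the 5-character prefix leaves the machine; the response is a list of
    # "SUFFIX:COUNT" lines in uppercase hex.
    response = requests.get("https://api.pwnedpasswords.com/range/" + prefix, timeout=10)
    response.raise_for_status()
    return any(line.split(":")[0] == suffix for line in response.text.splitlines())

print(appears_in_pwned("password123"))  # widely leaked strings return True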

35
plugins/portscanner.py Normal file
View File

@ -0,0 +1,35 @@
#!/usr/bin/python3
#
# portscanner.py
# NMAP port scanning wrapper for Caterpillar Proxy
#
# Caterpillar Proxy - The simple web debugging proxy (formerly, php-httpproxy)
# Namyheon Go (Catswords Research) <gnh1201@gmail.com>
# https://github.com/gnh1201/caterpillar
# Created at: 2022-01-26 (from github.com/gnh1201/welsonjs)
# Updated at: 2024-07-11
#
import nmap
from base import Extension
class PortScanner(Extension):
def __init__(self):
self.type = "rpcmethod"
self.method = "scan_ports_by_hosts"
self.exported_methods = []
def dispatch(self, type, id, params, conn):
hosts = params["hosts"]
binpath = params["binpath"]
nm = nmap.PortScanner(nmap_search_path=(binpath,))
result = nm.scan(hosts=hosts, arguments="-T5 -sV -p0-65535 --max-retries 0")
return result
if __name__ == "__main__":
    # Loaded as a plugin by server.py; nothing to do when run standalone.
    pass
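The dispatch() handler above delegates to python-nmap; the same scan can be reproduced directly as below. The target host and the lighter scan arguments are example values, and the nmap binary is assumed to be discoverable on PATH.

import nmap

nm = nmap.PortScanner()  # assumes nmap is on PATH
result = nm.scan(hosts="127.0.0.1", arguments="-T4 -F")  # much lighter than the plugin's full sweep
print(result["nmap"]["scaninfo"])
print(result["scan"].keys())  # per-host results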

61
plugins/serial.py Normal file
View File

@ -0,0 +1,61 @@
#!/usr/bin/python3
#
# serial.py
# Serial integration plugin for Caterpillar Proxy
#
# Caterpillar Proxy - The simple web debugging proxy (formerly, php-httpproxy)
# Teakwoo Kim <catry.me@gmail.com>
# https://github.com/gnh1201/caterpillar
# Created at: 2024-08-11
# Updated at: 2024-08-11
#
import serial
from decouple import config
from base import Extension, Logger
logger = Logger(name="serial")
try:
client_encoding = config("CLIENT_ENCODING")
except Exception as e:
logger.error("[*] Invalid configuration", exc_info=e)
class Serial(Extension):
def __init__(self):
self.type = "connector"
self.connection_type = "serial"
def dispatch(self, type, id, params, conn):
logger.info("[*] Greeting! dispatch")
conn.send(b"Greeting! dispatch")
def connect(self, conn, data, webserver, port, scheme, method, url):
connected = False
ser = None
try:
port_path = url.decode(client_encoding).replace('/', '')
if not ser:
ser = serial.Serial(port_path, baudrate=9600, timeout=2)
connected = True
logger.debug(f"Connected to {port_path} at 9600 baudrate")
ser.write(data)
logger.debug(f"Data sent to {port_path}: {data}")
ser_data = ser.read_all()
logger.debug(f"Data received: {ser_data}")
if ser_data:
conn.send(ser_data)  # relay the raw bytes back to the client
except serial.SerialException as e:
logger.error(f"Failed to connect to {port}", exc_info=e)
finally:
if ser and ser.is_open:
ser.close()
logger.debug(f"Serial port {port_path} closed")
return connected
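connect() above opens the serial port named by the request URL, writes the raw request bytes, and relays whatever the device replies. A stripped-down sketch of that round trip, outside the proxy, might look like this (the device path, baud rate, and probe bytes are assumptions):

import serial

with serial.Serial("/dev/ttyUSB0", baudrate=9600, timeout=2) as ser:
    ser.write(b"PING\r\n")      # raw request bytes, as connect() forwards them
    reply = ser.read_all()      # may be empty if the device stays silent
    print(reply)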

107
plugins/wayback.py Normal file
View File

@ -0,0 +1,107 @@
#!/usr/bin/python3
#
# wayback.py
# Cached previous page (e.g. Wayback Machine) integration plugin for Caterpillar Proxy
#
# Caterpillar Proxy - The simple and parasitic web proxy with SPAM filter
# Namyheon Go (Catswords Research) <gnh1201@gmail.com>
# https://github.com/gnh1201/caterpillar
# Created at: 2024-03-13
# Updated at: 2024-07-06
#
import requests
from decouple import config
from base import Extension, Logger
logger = Logger(name="wayback")
try:
client_encoding = config("CLIENT_ENCODING")
except Exception as e:
logger.error("[*] Invalid configuration", exc_info=e)
def get_cached_page_from_google(url):
status_code, text = (0, "")
# Google Cache URL
google_cache_url = "https://webcache.googleusercontent.com/search?q=cache:" + url
# Send a GET request to Google Cache URL
response = requests.get(google_cache_url)
# Check if the request was successful (status code 200)
if response.status_code == 200:
text = response.text # Extract content from response
else:
status_code = response.status_code
return status_code, text
# API documentation: https://archive.org/help/wayback_api.php
def get_cached_page_from_wayback(url):
status_code, text = (0, "")
# Wayback Machine API URL
wayback_api_url = "http://archive.org/wayback/available?url=" + url
# Send a GET request to Wayback Machine API
response = requests.get(wayback_api_url)
# Check if the request was successful (status code 200)
if response.status_code == 200:
try:
# Parse JSON response
data = response.json()
archived_snapshots = data.get("archived_snapshots", {})
closest_snapshot = archived_snapshots.get("closest", {})
# Check if the URL is available in the archive
if closest_snapshot:
archived_url = closest_snapshot.get("url", "")
# If URL is available, fetch the content of the archived page
if archived_url:
archived_page_response = requests.get(archived_url)
status_code = archived_page_response.status_code
if status_code == 200:
text = archived_page_response.text
else:
status_code = 404
else:
status_code = 404
except Exception:
status_code = 502
else:
status_code = response.status_code
return status_code, text
class Wayback(Extension):
def __init__(self):
self.type = "connector"  # this is a connector
self.connection_type = "wayback"
def connect(self, conn, data, webserver, port, scheme, method, url):
logger.info("[*] Connecting...")
connected = False
target_url = url.decode(client_encoding)
if not connected:
status_code, text = get_cached_page_from_google(target_url)
if status_code == 200:
conn.send(text.encode(client_encoding))
connected = True
if not connected:
status_code, text = get_cached_page_from_wayback(target_url)
if status_code == 200:
conn.send(text.encode(client_encoding))
connected = True
return connected
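The connector above first tries Google's cache and then falls back to the Wayback Machine availability API. The availability lookup alone can be reproduced as below; the target URL is only an example.

import requests

target = "https://example.com/"
info = requests.get(
    "http://archive.org/wayback/available", params={"url": target}, timeout=10
).json()
closest = info.get("archived_snapshots", {}).get("closest", {})
print(closest.get("url", "no snapshot available"))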

3
requirements-dev.txt Normal file
View File

@ -0,0 +1,3 @@
ruff==0.5.1
elasticsearch==8.14.0
yara-python==4.5.1

requirements.txt
View File

@ -1,6 +1,3 @@
python-decouple
requests
aiosmtpd
ruff
flask
flask_cors
aiosmtpd

221
server.py
View File

@ -7,14 +7,13 @@
# Namyheon Go (Catswords Research) <gnh1201@gmail.com>
# https://github.com/gnh1201/caterpillar
# Created at: 2022-10-06
# Updated at: 2025-02-17
# Updated at: 2024-07-11
#
import argparse
import socket
import sys
import os
import re
from _thread import *
from subprocess import PIPE, Popen
import base64
@ -25,7 +24,6 @@ import traceback
import textwrap
from datetime import datetime
from platform import python_version
import logging
import requests
from requests.auth import HTTPBasicAuth
@ -40,7 +38,7 @@ from base import (
Logger,
)
logger = Logger(name="server", level=logging.DEBUG)
logger = Logger(name="server")
# initialization
try:
@ -48,13 +46,12 @@ try:
_username, _password, server_url = extract_credentials(
config("SERVER_URL", default="")
)
connection_timeout = config("CONNECTION_TIMEOUT", default=5, cast=int)
server_connection_type = config("SERVER_CONNECTION_TYPE", default="proxy")
ca_key = config("CA_KEY", default="ca.key")
ca_cert = config("CA_CERT", default="ca.crt")
cert_key = config("CERT_KEY", default="cert.key")
cert_dir = config("CERT_DIR", default="certs/")
openssl_bin_path = config("OPENSSL_BINPATH", default=find_openssl_binpath())
cakey = config("CA_KEY", default="ca.key")
cacert = config("CA_CERT", default="ca.crt")
certkey = config("CERT_KEY", default="cert.key")
certdir = config("CERT_DIR", default="certs/")
openssl_binpath = config("OPENSSL_BINPATH", default=find_openssl_binpath())
client_encoding = config("CLIENT_ENCODING", default="utf-8")
local_domain = config("LOCAL_DOMAIN", default="")
proxy_pass = config("PROXY_PASS", default="")
@ -90,7 +87,7 @@ if _username:
auth = HTTPBasicAuth(_username, _password)
def parse_first_data(data: bytes):
def parse_first_data(data):
parsed_data = (b"", b"", b"", b"", b"")
try:
@ -129,33 +126,22 @@ def parse_first_data(data: bytes):
return parsed_data
def conn_string(conn: socket.socket, data: bytes, addr: bytes):
def conn_string(conn, data, addr):
# JSON-RPC 2.0 request
def process_jsonrpc2(_data: bytes):
json_data = json.loads(_data.decode(client_encoding, errors="ignore"))
if json_data["jsonrpc"] == "2.0":
def process_jsonrpc2(data):
jsondata = json.loads(data.decode(client_encoding, errors="ignore"))
if jsondata["jsonrpc"] == "2.0":
jsonrpc2_server(
conn, json_data["id"], json_data["method"], json_data["params"]
conn, jsondata["id"], jsondata["method"], jsondata["params"]
)
return True
return False
# debugging
logger.debug("@ " + ("%s:%s" % addr))
logger.debug("> " + data.hex(' '))
# JSON-RPC 2.0 request over Socket (stateful)
if data.find(b"{") == 0 and process_jsonrpc2(data):
# will be close by the client
return
# Check a preludes in connectors
connector = Extension.test_connectors(data)
if connector:
logger.info("[*] Connecting...")
connector.connect(conn, data, b'', b'', b'', b'', b'')
return
# parse first data (header)
webserver, port, scheme, method, url = parse_first_data(data)
@ -169,63 +155,53 @@ def conn_string(conn: socket.socket, data: bytes, addr: bytes):
return
# if it is reverse proxy
local_domains = list(filter(None, map(str.strip, local_domain.split(','))))
for domain in local_domains:
localserver = domain.encode(client_encoding)
# Resolve a cache mismatch issue when making requests to a local domain.
header_end = data.find(b"\r\n\r\n")
header_section_data = data[:header_end] if header_end > -1 else b''
header_host_pattern = re.compile(rb"\n\s*host\s*:\s*" + re.escape(localserver), re.IGNORECASE)
if webserver == localserver or header_host_pattern.search(header_section_data):
logger.info("[*] Reverse proxy requested: %s" % local_domain)
if local_domain != "":
localserver = local_domain.encode(client_encoding)
if webserver == localserver or data.find(b"\nHost: " + localserver) > -1:
logger.info("[*] Detected the reverse proxy request: %s" % local_domain)
scheme, _webserver, _port = proxy_pass.encode(client_encoding).split(b":")
webserver = _webserver[2:]
port = int(_port.decode(client_encoding))
method = b"CONNECT" if scheme == b"https" else method # proxy pass on HTTPS
break
proxy_server(webserver, port, scheme, method, url, conn, addr, data)
def jsonrpc2_server(
conn: socket.socket, _id: str, method: str, params: dict[str, str | int]
):
def jsonrpc2_server(conn, id, method, params):
if method == "relay_accept":
accepted_relay[_id] = conn
accepted_relay[id] = conn
connection_speed = params["connection_speed"]
logger.info("[*] connection speed: %s milliseconds" % str(connection_speed))
logger.info("[*] connection speed: %s milliseconds" % (str(connection_speed)))
while conn.fileno() > -1:
time.sleep(1)
del accepted_relay[_id]
logger.info("[*] relay destroyed: %s" % _id)
del accepted_relay[id]
logger.info("[*] relay destroyed: %s" % id)
else:
Extension.dispatch_rpcmethod(method, "call", _id, params, conn)
Extension.dispatch_rpcmethod(method, "call", id, params, conn)
# return in conn_string()
def proxy_connect(webserver: bytes, conn: socket.socket):
def proxy_connect(webserver, conn):
hostname = webserver.decode(client_encoding)
cert_path = "%s/%s.crt" % (cert_dir.rstrip("/"), hostname)
certpath = "%s/%s.crt" % (certdir.rstrip("/"), hostname)
if not os.path.exists(cert_dir):
os.makedirs(cert_dir)
if not os.path.exists(certdir):
os.makedirs(certdir)
# https://stackoverflow.com/questions/24055036/handle-https-request-in-proxy-server-by-c-sharp-connect-tunnel
conn.send(b"HTTP/1.1 200 Connection Established\r\n\r\n")
# https://github.com/inaz2/proxy2/blob/master/proxy2.py
try:
if not os.path.isfile(cert_path):
if not os.path.isfile(certpath):
epoch = "%d" % (time.time() * 1000)
p1 = Popen(
[
openssl_bin_path,
openssl_binpath,
"req",
"-new",
"-key",
cert_key,
certkey,
"-subj",
"/CN=%s" % hostname,
],
@ -233,19 +209,19 @@ def proxy_connect(webserver: bytes, conn: socket.socket):
)
p2 = Popen(
[
openssl_bin_path,
openssl_binpath,
"x509",
"-req",
"-days",
"3650",
"-CA",
ca_cert,
cacert,
"-CAkey",
ca_key,
cakey,
"-set_serial",
epoch,
"-out",
cert_path,
certpath,
],
stdin=p1.stdout,
stderr=PIPE,
@ -256,20 +232,20 @@ def proxy_connect(webserver: bytes, conn: socket.socket):
"[*] OpenSSL distribution not found on this system. Skipping certificate issuance.",
exc_info=e,
)
cert_path = "default.crt"
certpath = "default.crt"
except Exception as e:
logger.error("[*] Skipping certificate issuance.", exc_info=e)
cert_path = "default.crt"
logger.info("[*] Certificate file: %s" % cert_path)
logger.info("[*] Private key file: %s" % cert_key)
certpath = "default.crt"
logger.info("[*] Certificate file: %s" % (certpath))
logger.info("[*] Private key file: %s" % (certkey))
# https://stackoverflow.com/questions/11255530/python-simple-ssl-socket-server
# https://docs.python.org/3/library/ssl.html
context = ssl.SSLContext(ssl.PROTOCOL_TLS_SERVER)
context.check_hostname = False
context.verify_mode = ssl.CERT_NONE
context.load_cert_chain(certfile=cert_path, keyfile=cert_key)
context.load_cert_chain(certfile=certpath, keyfile=certkey)
try:
# https://stackoverflow.com/questions/11255530/python-simple-ssl-socket-server
@ -280,14 +256,12 @@ def proxy_connect(webserver: bytes, conn: socket.socket):
"[*] SSL negotiation failed.",
exc_info=e,
)
return conn, b""
return (conn, b"")
return conn, data
return (conn, data)
def proxy_check_filtered(
data: bytes, webserver: bytes, port: bytes, scheme: bytes, method: bytes, url: bytes
):
def proxy_check_filtered(data, webserver, port, scheme, method, url):
filtered = False
filters = Extension.get_filters()
@ -298,16 +272,7 @@ def proxy_check_filtered(
return filtered
def proxy_server(
webserver: bytes,
port: bytes,
scheme: bytes,
method: bytes,
url: bytes,
conn: socket.socket,
addr: bytes,
data: bytes,
):
def proxy_server(webserver, port, scheme, method, url, conn, addr, data):
try:
logger.info("[*] Started the request. %s" % (str(addr[0])))
@ -331,11 +296,14 @@ def proxy_server(
_, _, _, method, url = parse_first_data(data)
# https://stackoverflow.com/questions/44343739/python-sockets-ssl-eof-occurred-in-violation-of-protocol
def sock_close(_sock: socket.socket):
_sock.close()
def sock_close(sock, is_ssl=False):
# if is_ssl:
# sock = sock.unwrap()
# sock.shutdown(socket.SHUT_RDWR)
sock.close()
# Wait to see if there is more data to transmit
def sendall(_sock: socket.socket, _conn: socket.socket, _data: bytes):
def sendall(sock, conn, data):
# send first chuck
if proxy_check_filtered(data, webserver, port, scheme, method, url):
sock.close()
@ -346,7 +314,7 @@ def proxy_server(
# send following chunks
buffered = b""
conn.settimeout(connection_timeout)
conn.settimeout(1)
while True:
try:
chunk = conn.recv(buffer_size)
@ -356,7 +324,7 @@ def proxy_server(
if proxy_check_filtered(
buffered, webserver, port, scheme, method, url
):
sock_close(sock)
sock_close(sock, is_ssl)
raise Exception("Filtered request")
sock.send(chunk)
if len(buffered) > buffer_size * 2:
@ -386,7 +354,7 @@ def proxy_server(
i = 0
is_http_403 = False
_buffered = b""
buffered = b""
while True:
chunk = sock.recv(buffer_size)
if not chunk:
@ -394,26 +362,24 @@ def proxy_server(
if i == 0 and chunk.find(b"HTTP/1.1 403") == 0:
is_http_403 = True
break
_buffered += chunk
if proxy_check_filtered(
_buffered, webserver, port, scheme, method, url
):
sock_close(sock)
buffered += chunk
if proxy_check_filtered(buffered, webserver, port, scheme, method, url):
sock_close(sock, is_ssl)
add_filtered_host(webserver.decode(client_encoding), "127.0.0.1")
raise Exception("Filtered response")
conn.send(chunk)
if len(_buffered) > buffer_size * 2:
_buffered = _buffered[-buffer_size * 2 :]
if len(buffered) > buffer_size * 2:
buffered = buffered[-buffer_size * 2 :]
i += 1
# when blocked
if is_http_403:
logger.warning(
"[*] Blocked the request by remote server: %s"
% webserver.decode(client_encoding)
% (webserver.decode(client_encoding))
)
def bypass_callback(response: requests.Response):
def bypass_callback(response, *args, **kwargs):
if response.status_code != 200:
conn.sendall(b'HTTP/1.1 403 Forbidden\r\n\r\n{"status":403}')
return
@ -454,7 +420,7 @@ def proxy_server(
else:
conn.sendall(b'HTTP/1.1 403 Forbidden\r\n\r\n{"status":403}')
sock_close(sock)
sock_close(sock, is_ssl)
logger.info(
"[*] Received %s chunks. (%s bytes per chunk)"
@ -463,8 +429,6 @@ def proxy_server(
# stateful mode
elif server_connection_type == "stateful":
client_address = str(addr[0])
proxy_data = {
"headers": {
"User-Agent": "php-httpproxy/0.1.5 (Client; Python "
@ -473,7 +437,7 @@ def proxy_server(
},
"data": {
"buffer_size": str(buffer_size),
"client_address": client_address,
"client_address": str(addr[0]),
"client_port": str(listening_port),
"client_encoding": client_encoding,
"remote_address": webserver.decode(client_encoding),
@ -484,7 +448,7 @@ def proxy_server(
}
# get client address
logger.info("[*] Resolving the client address...")
logger.info("[*] resolving the client address...")
while len(resolved_address_list) == 0:
try:
_, query_data = jsonrpc2_encode("get_client_address")
@ -497,23 +461,11 @@ def proxy_server(
)
if query.status_code == 200:
result = query.json()["result"]
if isinstance(result["data"], str):
client_address = result["data"]
resolved_address_list.append(client_address)
elif isinstance(result["data"], list):
client_address = result["data"][0]
resolved_address_list.append(client_address)
else:
logger.warn("[*] Failed to resolve a client address. Retrying...")
else:
logger.warn("[*] Failed to resolve a client address. Retrying...")
resolved_address_list.append(result["data"])
logger.info("[*] resolved IP: %s" % (result["data"]))
except requests.exceptions.ReadTimeout:
logger.warn("[*] Failed to resolve a client address. Retrying...")
# update the client address
logger.info("[*] Use the client address: %s" % (client_address))
proxy_data["data"]["client_address"] = client_address
pass
proxy_data["data"]["client_address"] = resolved_address_list[0]
# build a tunnel
def relay_connect(id, raw_data, proxy_data):
@ -557,27 +509,27 @@ def proxy_server(
else:
resolved_address_list.remove(resolved_address_list[0])
logger.info("[*] the relay is gone. %s" % id)
sock_close(sock)
sock_close(sock, is_ssl)
return
# get response
i = 0
buffered = b""
while True:
_chunk = sock.recv(buffer_size)
if not _chunk:
chunk = sock.recv(buffer_size)
if not chunk:
break
buffered += _chunk
buffered += chunk
if proxy_check_filtered(buffered, webserver, port, scheme, method, url):
sock_close(sock)
sock_close(sock, is_ssl)
add_filtered_host(webserver.decode(client_encoding), "127.0.0.1")
raise Exception("Filtered response")
conn.send(_chunk)
conn.send(chunk)
if len(buffered) > buffer_size * 2:
buffered = buffered[-buffer_size * 2 :]
i += 1
sock_close(sock)
sock_close(sock, is_ssl)
logger.info(
"[*] Received %s chunks. (%s bytes per chunk)"
@ -640,19 +592,19 @@ def proxy_server(
logger.info("[*] Connecting...")
connector.connect(conn, data, webserver, port, scheme, method, url)
else:
raise Exception("[*] The request from " + ("%s:%s" % addr) + " is ignored due to an undefined connector type.")
raise Exception("Unsupported connection type")
logger.info("[*] Request and received. Done. %s" % (str(addr[0])))
conn.close()
except Exception as e:
print(traceback.format_exc())
logger.warning("[*] Ignored the request.", exc_info=e)
logger.error("[*] Exception on requesting the data.", exc_info=e)
conn.sendall(b'HTTP/1.1 403 Forbidden\r\n\r\n{"status":403}')
conn.close()
# journaling a filtered hosts
def add_filtered_host(domain: str, ip_address: str):
def add_filtered_host(domain, ip_address):
hosts_path = "./filtered.hosts"
with open(hosts_path, "r") as file:
lines = file.readlines()
@ -667,37 +619,24 @@ def add_filtered_host(domain: str, ip_address: str):
def start(): # Main Program
try:
sock = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
sock.bind(("", listening_port))
sock.listen(max_connection)
logger.warning("[*] Server started successfully [ %d ]" % listening_port)
logger.info("[*] Server started successfully [ %d ]" % listening_port)
except Exception as e:
logger.error("[*] Unable to Initialize Socket", exc_info=e)
sys.exit(2)
def recv(conn):
conn.settimeout(connection_timeout)
try:
data = conn.recv(buffer_size)
if not data:
data = b''
except socket.timeout:
logger.warning(f"No data received from " + ("%s:%s" % addr) + ". Attempting to request data.")
data = b''
return data
while True:
try:
conn, addr = sock.accept() # Accept connection from client browser
data = recv(conn) # Recieve client data
data = conn.recv(buffer_size) # Recieve client data
start_new_thread(conn_string, (conn, data, addr)) # Starting a thread
except KeyboardInterrupt:
sock.close()
logger.info("[*] Graceful Shutdown")
sys.exit(1)
if __name__ == "__main__":
# Fix Value error
if use_extensions:

30
smtp.py
View File

@ -20,7 +20,7 @@ from requests.auth import HTTPBasicAuth
from base import (
extract_credentials,
jsonrpc2_encode,
Logger, jsonrpc2_decode,
Logger,
)
logger = Logger(name="smtp")
@ -40,22 +40,21 @@ auth = None
if _username:
auth = HTTPBasicAuth(_username, _password)
class CaterpillarSMTPHandler:
def __init__(self):
self.smtpd_hostname = "CaterpillarSMTPServer"
self.smtp_version = "0.1.6"
async def handle_DATA(self, server, session, envelope):
mail_from = envelope.mail_from
rcpt_tos = envelope.rcpt_tos
mailfrom = envelope.mail_from
rcpttos = envelope.rcpt_tos
data = envelope.content
message = EmailMessage()
message.set_content(data)
subject = message.get("Subject", "")
to = message.get("To", "")
subject = message.get('Subject', '')
to = message.get('To', '')
proxy_data = {
"headers": {
@ -65,7 +64,7 @@ class CaterpillarSMTPHandler:
},
"data": {
"to": to,
"from": mail_from,
"from": mailfrom,
"subject": subject,
"message": data.decode("utf-8"),
},
@ -76,23 +75,23 @@ class CaterpillarSMTPHandler:
response = await asyncio.to_thread(
requests.post,
server_url,
headers=proxy_data["headers"],
headers=proxy_data['headers'],
data=raw_data,
auth=auth,
auth=auth
)
if response.status_code == 200:
_type, _id, rpc_data = jsonrpc2_decode(response.text)
if rpc_data["success"]:
type, id, rpcdata = jsonrpc2_decode(response.text)
if rpcdata['success']:
logger.info("[*] Email sent successfully.")
else:
raise Exception(f"({rpc_data['code']}) {rpc_data['message']}")
raise Exception(f"({rpcdata['code']}) {rpcdata['message']}")
else:
raise Exception(f"Status {response.status_code}")
except Exception as e:
logger.error("[*] Failed to send email", exc_info=e)
return "500 Could not process your message. " + str(e)
return '500 Could not process your message. ' + str(e)
return "250 OK"
return '250 OK'
# https://aiosmtpd-pepoluan.readthedocs.io/en/latest/migrating.html
@ -102,12 +101,11 @@ def main():
# Run the event loop in a separate thread.
controller.start()
# Wait for the user to press Return.
input("SMTP server running. Press Return to stop server and exit.")
input('SMTP server running. Press Return to stop server and exit.')
controller.stop()
logger.warning("[*] User has requested an interrupt")
logger.warning("[*] Application Exiting.....")
sys.exit()
if __name__ == "__main__":
main()

27
web.py
View File

@ -7,20 +7,18 @@
# Namyheon Go (Catswords Research) <gnh1201@gmail.com>
# https://github.com/gnh1201/caterpillar
# Created at: 2024-05-20
# Updated at: 2024-10-25
# Updated at: 2024-07-10
#
import os
import sys
from decouple import config
from flask import Flask, request, render_template
from flask_cors import CORS
from base import Extension, jsonrpc2_error_encode, Logger
# TODO: 나중에 Flask 커스텀 핸들러 구현 해야 함
logger = Logger(name="web")
app = Flask(__name__)
CORS(app)
app.config["UPLOAD_FOLDER"] = "data/"
if not os.path.exists(app.config["UPLOAD_FOLDER"]):
os.makedirs(app.config["UPLOAD_FOLDER"])
@ -51,25 +49,18 @@ def process_jsonrpc2():
conn = Connection(request)
# JSON-RPC 2.0 request
json_data = request.get_json(silent=True)
if json_data["jsonrpc"] == "2.0":
result = Extension.dispatch_rpcmethod(
json_data["method"], "call", json_data["id"], json_data["params"], conn)
return {
"jsonrpc": "2.0",
"result": {
"data": result
},
"id": None
}
jsondata = request.get_json(silent=True)
if jsondata["jsonrpc"] == "2.0":
return Extension.dispatch_rpcmethod(
jsondata["method"], "call", jsondata["id"], jsondata["params"], conn
)
# when error
return jsonrpc2_error_encode({"message": "Not valid JSON-RPC 2.0 request"})
return jsonrpc2_error_encode({"message": "Not vaild JSON-RPC 2.0 request"})
def jsonrpc2_server(conn, _id, method, params):
return Extension.dispatch_rpcmethod(method, "call", _id, params, conn)
def jsonrpc2_server(conn, id, method, params):
return Extension.dispatch_rpcmethod(method, "call", id, params, conn)
class Connection: