Compare commits

...

11 Commits

Author SHA1 Message Date
Jonny Saunders
f6daa69755
Merge 22485b9422 into fbe9728f36 2025-05-06 15:05:48 +00:00
Claire
fbe9728f36
Bump version to v4.3.8 (#34626)
2025-05-06 14:17:07 +00:00
Claire
3bbf3e9709
Fix code style issue (#34624) 2025-05-06 13:35:54 +00:00
Claire
79931bf3ae
Merge commit from fork
* Check scheme in account and post links

* Harden media attachments

* Client-side mitigation

* Client-side mitigation for media attachments
2025-05-06 15:02:13 +02:00
sneakers-the-rat
22485b9422
class attr access 2025-05-05 13:44:41 -07:00
sneakers-the-rat
f9600fcbd3
bump limits 2025-05-05 13:44:41 -07:00
sneakers-the-rat
849c0500b0
filter for only public and unlisted posts 2025-05-05 13:44:41 -07:00
sneakers-the-rat
e489b9e34d
port account backfill implementation from #32634 2025-05-05 13:44:41 -07:00
sneakers-the-rat
e5e83ebaed
apply changes from #34610 2025-05-05 13:39:49 -07:00
sneakers-the-rat
48ed1a38a1
oh right ruby uses double && for logical and 2025-05-05 13:36:49 -07:00
sneakers-the-rat
ccb2f9a210
consolidate collection handling in jsonld helper 2025-05-05 13:36:46 -07:00
21 changed files with 354 additions and 123 deletions

View File

@@ -110,3 +110,17 @@ FETCH_REPLIES_MAX_SINGLE=500
# Max number of replies Collection pages to fetch - total
FETCH_REPLIES_MAX_PAGES=500
# Account Backfill Behavior
# --------------------------
# When the first person from your instance follows a remote account,
# backfill their most recent n statuses.
# (default: true if unset, set explicitly to ``false`` to disable)
ACCOUNT_BACKFILL_ENABLED=true
# Max statuses to fetch when backfilling a new account
ACCOUNT_BACKFILL_MAX_STATUSES=1000
# Max number of replies Collection pages to fetch
ACCOUNT_BACKFILL_MAX_PAGES=200
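For reference, these variables map onto the constants that the new ActivityPub::AccountBackfillService (further down in this diff) defines, which is where the "true unless explicitly set to false" behavior comes from:

# Mirrors the constants defined in ActivityPub::AccountBackfillService below.
ENABLED = ENV['ACCOUNT_BACKFILL_ENABLED'].nil? || ENV['ACCOUNT_BACKFILL_ENABLED'] == 'true'
MAX_STATUSES = (ENV['ACCOUNT_BACKFILL_MAX_STATUSES'] || 1000).to_i # default 1000
MAX_PAGES = (ENV['ACCOUNT_BACKFILL_MAX_PAGES'] || 200).to_i # default 200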

View File

@@ -2,9 +2,34 @@
All notable changes to this project will be documented in this file.
## [4.3.8] - 2025-05-06
### Security
- Update dependencies
- Check scheme on account, profile, and media URLs ([GHSA-x2rc-v5wx-g3m5](https://github.com/mastodon/mastodon/security/advisories/GHSA-x2rc-v5wx-g3m5))
### Added
- Add warning for REDIS_NAMESPACE deprecation at startup (#34581 by @ClearlyClaire)
- Add built-in context for interaction policies (#34574 by @ClearlyClaire)
### Changed
- Change activity distribution error handling to skip retrying for deleted accounts (#33617 by @ClearlyClaire)
### Removed
- Remove double-query for signed query strings (#34610 by @ClearlyClaire)
### Fixed
- Fix incorrect redirect in response to unauthenticated API requests in limited federation mode (#34549 by @ClearlyClaire)
- Fix sign-up e-mail confirmation page reloading on error or redirect (#34548 by @ClearlyClaire)
## [4.3.7] - 2025-04-02
### Add
### Added
- Add delay to profile updates to debounce them (#34137 by @ClearlyClaire)
- Add support for paginating partial collections in `SynchronizeFollowersService` (#34272 and #34277 by @ClearlyClaire)

View File

@@ -215,6 +215,72 @@ module JsonLdHelper
end
end
# Iterate through the pages of an activitypub collection,
# returning the collected items and the number of pages that were fetched.
#
# @param collection_or_uri [String, Hash]
# either the URI or an already-fetched AP object
# @param max_pages [Integer, nil]
# Max pages to fetch; if nil, fetch until there are no more pages
# @param max_items [Integer, nil]
# Max items to fetch; if nil, fetch until there are no more items
# @param reference_uri [String, nil]
# If not nil, a URI to compare to the collection URI.
# If the host of the collection URI does not match the reference URI,
# do not fetch the collection page.
# @param on_behalf_of [Account, nil]
# Sign the request on behalf of the Account, if not nil
# @return [Array<Array<Hash>, Integer>, nil]
# The collection items and the number of pages fetched
def collection_items(collection_or_uri, max_pages: 1, max_items: nil, reference_uri: nil, on_behalf_of: nil)
collection = fetch_collection(collection_or_uri, reference_uri: reference_uri, on_behalf_of: on_behalf_of)
return unless collection.is_a?(Hash)
collection = fetch_collection(collection['first'], reference_uri: reference_uri, on_behalf_of: on_behalf_of) if collection['first'].present?
return unless collection.is_a?(Hash)
items = []
n_pages = 1
while collection.is_a?(Hash)
items.concat(as_array(collection_page_items(collection)))
break if !max_items.nil? && items.size >= max_items
break if !max_pages.nil? && n_pages >= max_pages
collection = collection['next'].present? ? fetch_collection(collection['next'], reference_uri: reference_uri, on_behalf_of: on_behalf_of) : nil
n_pages += 1
end
[items, n_pages]
end
def collection_page_items(collection)
case collection['type']
when 'Collection', 'CollectionPage'
collection['items']
when 'OrderedCollection', 'OrderedCollectionPage'
collection['orderedItems']
end
end
# Fetch a single collection page
# To get the whole collection, use collection_items
#
# @param collection_or_uri [String, Hash]
# @param reference_uri [String, nil]
# If not nil, a URI to compare to the collection URI.
# If the host of the collection URI does not match the reference URI,
# do not fetch the collection page.
# @param on_behalf_of [Account, nil]
# Sign the request on behalf of the Account, if not nil
# @return [Hash, nil]
def fetch_collection(collection_or_uri, reference_uri: nil, on_behalf_of: nil)
return collection_or_uri if collection_or_uri.is_a?(Hash)
return if !reference_uri.nil? && non_matching_uri_hosts?(reference_uri, collection_or_uri)
fetch_resource_without_id_validation(collection_or_uri, on_behalf_of, raise_on_error: :temporary)
end
def valid_activitypub_content_type?(response)
return true if response.mime_type == 'application/activity+json'
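A minimal usage sketch for the new collection_items helper, assuming a caller that includes JsonLdHelper; the class, method, and variable names here are illustrative and not part of this diff:

class ExampleOutboxFetcher
  include JsonLdHelper

  # Fetch up to 3 pages / 200 items from a remote account's outbox,
  # refusing pages hosted on a different domain than the account itself
  # and signing the requests on behalf of a local follower.
  def recent_items(remote_account, local_follower)
    items, pages_fetched = collection_items(
      remote_account.outbox_url,
      max_pages: 3,
      max_items: 200,
      reference_uri: remote_account.uri,
      on_behalf_of: local_follower
    )
    Rails.logger.debug { "fetched #{items&.size || 0} items across #{pages_fetched || 0} page(s)" }
    items || []
  end
end

This is the same shape of call that the backfill, featured-collection, featured-tags, and replies services below now make instead of each carrying its own pagination loop.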

View File

@@ -77,6 +77,17 @@ export function normalizeStatus(status, normalOldStatus) {
normalStatus.contentHtml = emojify(normalStatus.content, emojiMap);
normalStatus.spoilerHtml = emojify(escapeTextContentForBrowser(spoilerText), emojiMap);
normalStatus.hidden = expandSpoilers ? false : spoilerText.length > 0 || normalStatus.sensitive;
if (normalStatus.url && !(normalStatus.url.startsWith('http://') || normalStatus.url.startsWith('https://'))) {
normalStatus.url = null;
}
normalStatus.url ||= normalStatus.uri;
normalStatus.media_attachments.forEach(item => {
if (item.remote_url && !(item.remote_url.startsWith('http://') || item.remote_url.startsWith('https://')))
item.remote_url = null;
});
}
if (normalOldStatus) {

View File

@@ -144,5 +144,10 @@ export function createAccountFromServerJSON(serverJSON: ApiAccountJSON) {
),
note_emojified: emojify(accountJSON.note, emojiMap),
note_plain: unescapeHTML(accountJSON.note),
url:
accountJSON.url.startsWith('http://') ||
accountJSON.url.startsWith('https://')
? accountJSON.url
: accountJSON.uri,
});
}

View File

@@ -15,13 +15,15 @@ class ActivityPub::Parser::MediaAttachmentParser
end
def remote_url
Addressable::URI.parse(@json['url'])&.normalize&.to_s
url = Addressable::URI.parse(@json['url'])&.normalize&.to_s
url unless unsupported_uri_scheme?(url)
rescue Addressable::URI::InvalidURIError
nil
end
def thumbnail_remote_url
Addressable::URI.parse(@json['icon'].is_a?(Hash) ? @json['icon']['url'] : @json['icon'])&.normalize&.to_s
url = Addressable::URI.parse(@json['icon'].is_a?(Hash) ? @json['icon']['url'] : @json['icon'])&.normalize&.to_s
url unless unsupported_uri_scheme?(url)
rescue Addressable::URI::InvalidURIError
nil
end
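For context, unsupported_uri_scheme? is an existing JsonLdHelper predicate rather than something added here; assuming it mirrors the client-side startsWith('http://') / startsWith('https://') guards in the normalizer above, the check amounts to:

# Sketch of the assumed behavior: treat nil or any non-http(s) URL as unsupported.
def unsupported_uri_scheme?(uri)
  uri.nil? || !uri.start_with?('http://', 'https://')
end

With that guard in place, a javascript: or data: value in a remote document's url or icon field is dropped (remote_url becomes nil) instead of being stored and later rendered as a link.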

View File

@@ -29,7 +29,10 @@ class ActivityPub::Parser::StatusParser
end
def url
url_to_href(@object['url'], 'text/html') if @object['url'].present?
return if @object['url'].blank?
url = url_to_href(@object['url'], 'text/html')
url unless unsupported_uri_scheme?(url)
end
def text

View File

@@ -4,6 +4,7 @@ require 'singleton'
class ActivityPub::TagManager
include Singleton
include JsonLdHelper
include RoutingHelper
CONTEXT = 'https://www.w3.org/ns/activitystreams'
@@ -17,7 +18,7 @@ class ActivityPub::TagManager
end
def url_for(target)
return target.url if target.respond_to?(:local?) && !target.local?
return unsupported_uri_scheme?(target.url) ? nil : target.url if target.respond_to?(:local?) && !target.local?
return unless target.respond_to?(:object_type)

View File

@@ -32,6 +32,7 @@ class FollowRequest < ApplicationRecord
validates :languages, language: true
def authorize!
is_first_follow = first_follow?
follow = account.follow!(target_account, reblogs: show_reblogs, notify: notify, languages: languages, uri: uri, bypass_limit: true)
if account.local?
@@ -40,6 +41,7 @@ class FollowRequest < ApplicationRecord
MergeWorker.push_bulk(List.where(account: account).joins(:list_accounts).where(list_accounts: { account_id: target_account.id }).pluck(:id)) do |list_id|
[target_account.id, list_id, 'list']
end
ActivityPub::AccountBackfillWorker.perform_async(target_account.id) if is_first_follow && ActivityPub::AccountBackfillService::ENABLED
end
destroy!
@@ -51,6 +53,10 @@ class FollowRequest < ApplicationRecord
false # Force uri_for to use uri attribute
end
def first_follow?
!target_account.followers.local.exists?
end
before_validation :set_uri, only: :create
after_commit :invalidate_follow_recommendations_cache

View File

@@ -0,0 +1,53 @@
# frozen_string_literal: true
class ActivityPub::AccountBackfillService < BaseService
include JsonLdHelper
ENABLED = ENV['ACCOUNT_BACKFILL_ENABLED'].nil? || ENV['ACCOUNT_BACKFILL_ENABLED'] == 'true'
MAX_STATUSES = (ENV['ACCOUNT_BACKFILL_MAX_STATUSES'] || 1000).to_i
MAX_PAGES = (ENV['ACCOUNT_BACKFILL_MAX_PAGES'] || 200).to_i
def call(account, on_behalf_of: nil, request_id: nil)
return unless ENABLED
@account = account
return if @account.nil? || @account.outbox_url.nil?
@items, = collection_items(@account.outbox_url, max_items: MAX_STATUSES, max_pages: MAX_PAGES, on_behalf_of: on_behalf_of)
@items = filter_items(@items)
return if @items.nil?
on_behalf_of_id = on_behalf_of&.id
FetchReplyWorker.push_bulk(@items) do |status_uri_or_body|
if status_uri_or_body.is_a?(Hash) && status_uri_or_body.key?('object') && status_uri_or_body.key?('id')
# Re-add the minimally-acceptable @context, which gets stripped because this object comes inside a collection
status_uri_or_body['@context'] = ActivityPub::TagManager::CONTEXT unless status_uri_or_body.key?('@context')
[status_uri_or_body['id'], { prefetched_body: status_uri_or_body, request_id: request_id, on_behalf_of: on_behalf_of_id }]
else
[status_uri_or_body, { request_id: request_id, on_behalf_of: on_behalf_of_id }]
end
end
@items
end
private
# Reject any non-public statuses.
# Since our request may have been signed on behalf of the follower,
# we may have received followers-only statuses.
#
# Formally, a followers-only status is addressed to the account's followers collection.
# We were not in that collection at the time that the post was made,
# so followers-only statuses fetched by backfilling are not addressed to us.
# Public and unlisted statuses are sent to the activitystreams "Public" entity.
# We are part of the public, so those posts *are* addressed to us.
#
# @param items [Array<Hash>]
# @return [Array<Hash>]
def filter_items(items)
allowed = [:public, :unlisted]
items.filter { |item| item.is_a?(String) || allowed.include?(ActivityPub::Parser::StatusParser.new(item).visibility) }
end
end
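A sketch of how the service gets triggered end to end. The FollowRequest#authorize! hook earlier in this diff enqueues the worker on the first local follow; the console-style calls and the remote_account / local_follower names below are illustrative:

# Asynchronously, as FollowRequest#authorize! does when the first local follower appears:
ActivityPub::AccountBackfillWorker.perform_async(remote_account.id)

# Or synchronously, signing the outbox fetches on behalf of a local follower:
statuses = ActivityPub::AccountBackfillService.new.call(remote_account, on_behalf_of: local_follower)

The return value is the filtered list of outbox items; the actual status creation happens in the FetchReplyWorker jobs the service enqueues.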

View File

@@ -12,30 +12,12 @@ class ActivityPub::FetchFeaturedCollectionService < BaseService
return unless supported_context?(@json)
process_items(collection_items(@json))
@items, = collection_items(@json, max_pages: 1, reference_uri: @account.uri, on_behalf_of: local_follower)
process_items(@items)
end
private
def collection_items(collection)
collection = fetch_collection(collection['first']) if collection['first'].present?
return unless collection.is_a?(Hash)
case collection['type']
when 'Collection', 'CollectionPage'
as_array(collection['items'])
when 'OrderedCollection', 'OrderedCollectionPage'
as_array(collection['orderedItems'])
end
end
def fetch_collection(collection_or_uri)
return collection_or_uri if collection_or_uri.is_a?(Hash)
return if non_matching_uri_hosts?(@account.uri, collection_or_uri)
fetch_resource_without_id_validation(collection_or_uri, local_follower, raise_on_error: :temporary)
end
def process_items(items)
return if items.nil?

View File

@@ -11,43 +11,12 @@ class ActivityPub::FetchFeaturedTagsCollectionService < BaseService
return unless supported_context?(@json)
process_items(collection_items(@json))
@items, = collection_items(@json, max_items: FeaturedTag::LIMIT, reference_uri: @account.uri, on_behalf_of: local_follower)
process_items(@items)
end
private
def collection_items(collection)
all_items = []
collection = fetch_collection(collection['first']) if collection['first'].present?
while collection.is_a?(Hash)
items = case collection['type']
when 'Collection', 'CollectionPage'
collection['items']
when 'OrderedCollection', 'OrderedCollectionPage'
collection['orderedItems']
end
break if items.blank?
all_items.concat(items)
break if all_items.size >= FeaturedTag::LIMIT
collection = collection['next'].present? ? fetch_collection(collection['next']) : nil
end
all_items
end
def fetch_collection(collection_or_uri)
return collection_or_uri if collection_or_uri.is_a?(Hash)
return if non_matching_uri_hosts?(@account.uri, collection_or_uri)
fetch_resource_without_id_validation(collection_or_uri, local_follower, raise_on_error: :temporary)
end
def process_items(items)
names = items.filter_map { |item| item['type'] == 'Hashtag' && item['name']&.delete_prefix('#') }.take(FeaturedTag::LIMIT)
tags = names.index_by { |name| HashtagNormalizer.new.normalize(name) }

View File

@@ -11,6 +11,9 @@ class ActivityPub::FetchRemoteStatusService < BaseService
def call(uri, prefetched_body: nil, on_behalf_of: nil, expected_actor_uri: nil, request_id: nil)
return if domain_not_allowed?(uri)
# load the account if given as an ID
on_behalf_of = Account.find(on_behalf_of) unless on_behalf_of.nil? || on_behalf_of.is_a?(Account)
@request_id = request_id || "#{Time.now.utc.to_i}-status-#{uri}"
@json = if prefetched_body.nil?
fetch_status(uri, true, on_behalf_of)

View File

@@ -8,9 +8,13 @@ class ActivityPub::FetchRepliesService < BaseService
def call(reference_uri, collection_or_uri, max_pages: 1, allow_synchronous_requests: true, request_id: nil)
@reference_uri = reference_uri
@allow_synchronous_requests = allow_synchronous_requests
return if !allow_synchronous_requests && !collection_or_uri.is_a?(Hash)
@items, n_pages = collection_items(collection_or_uri, max_pages: max_pages)
# if given a prefetched collection while forbidding synchronous requests,
# process it and return without fetching additional pages
max_pages = 1 if !allow_synchronous_requests && collection_or_uri.is_a?(Hash)
@items, n_pages = collection_items(collection_or_uri, max_pages: max_pages, max_items: MAX_REPLIES, reference_uri: @reference_uri)
return if @items.nil?
@items = filter_replies(@items)
@@ -21,45 +25,6 @@ class ActivityPub::FetchRepliesService < BaseService
private
def collection_items(collection_or_uri, max_pages: 1)
collection = fetch_collection(collection_or_uri)
return unless collection.is_a?(Hash)
collection = fetch_collection(collection['first']) if collection['first'].present?
return unless collection.is_a?(Hash)
items = []
n_pages = 1
while collection.is_a?(Hash)
items.concat(as_array(collection_page_items(collection)))
break if items.size >= MAX_REPLIES
break if n_pages >= max_pages
collection = collection['next'].present? ? fetch_collection(collection['next']) : nil
n_pages += 1
end
[items, n_pages]
end
def collection_page_items(collection)
case collection['type']
when 'Collection', 'CollectionPage'
collection['items']
when 'OrderedCollection', 'OrderedCollectionPage'
collection['orderedItems']
end
end
def fetch_collection(collection_or_uri)
return collection_or_uri if collection_or_uri.is_a?(Hash)
return unless @allow_synchronous_requests
return if non_matching_uri_hosts?(@reference_uri, collection_or_uri)
fetch_resource_without_id_validation(collection_or_uri, nil, raise_on_error: :temporary)
end
def filter_replies(items)
# Only fetch replies to the same server as the original status to avoid
# amplification attacks.

View File

@@ -63,10 +63,10 @@ class ActivityPub::SynchronizeFollowersService < BaseService
# Only returns true if the whole collection has been processed
def process_collection!(collection_uri, max_pages: MAX_COLLECTION_PAGES)
collection = fetch_collection(collection_uri)
collection = fetch_collection(collection_uri, reference_uri: @account.uri)
return false unless collection.is_a?(Hash)
collection = fetch_collection(collection['first']) if collection['first'].present?
collection = fetch_collection(collection['first'], reference_uri: @account.uri) if collection['first'].present?
while collection.is_a?(Hash)
process_page!(as_array(collection_page_items(collection)))
@@ -81,20 +81,4 @@ class ActivityPub::SynchronizeFollowersService < BaseService
false
end
def collection_page_items(collection)
case collection['type']
when 'Collection', 'CollectionPage'
collection['items']
when 'OrderedCollection', 'OrderedCollectionPage'
collection['orderedItems']
end
end
def fetch_collection(collection_or_uri)
return collection_or_uri if collection_or_uri.is_a?(Hash)
return if non_matching_uri_hosts?(@account.uri, collection_or_uri)
fetch_resource_without_id_validation(collection_or_uri, nil, raise_on_error: :temporary)
end
end

View File

@@ -0,0 +1,13 @@
# frozen_string_literal: true
class ActivityPub::AccountBackfillWorker
include Sidekiq::Worker
include ExponentialBackoff
def perform(account_id, options = {})
account = Account.find(account_id)
return if account.local?
ActivityPub::AccountBackfillService.new.call(account, **options.deep_symbolize_keys)
end
end

View File

@@ -7,6 +7,6 @@ class FetchReplyWorker
sidekiq_options queue: 'pull', retry: 3
def perform(child_url, options = {})
FetchRemoteStatusService.new.call(child_url, **options.deep_symbolize_keys)
FetchRemoteStatusService.new.call(child_url, **options.symbolize_keys)
end
end

View File

@@ -59,7 +59,7 @@ services:
web:
# You can uncomment the following line if you want to not use the prebuilt image, for example if you have local code changes
# build: .
image: ghcr.io/mastodon/mastodon:v4.3.7
image: ghcr.io/mastodon/mastodon:v4.3.8
restart: always
env_file: .env.production
command: bundle exec puma -C config/puma.rb
@@ -83,7 +83,7 @@ services:
# build:
# dockerfile: ./streaming/Dockerfile
# context: .
image: ghcr.io/mastodon/mastodon-streaming:v4.3.7
image: ghcr.io/mastodon/mastodon-streaming:v4.3.8
restart: always
env_file: .env.production
command: node ./streaming/index.js
@@ -102,7 +102,7 @@ services:
sidekiq:
# You can uncomment the following line if you want to not use the prebuilt image, for example if you have local code changes
# build: .
image: ghcr.io/mastodon/mastodon:v4.3.7
image: ghcr.io/mastodon/mastodon:v4.3.8
restart: always
env_file: .env.production
command: bundle exec sidekiq

View File

@@ -17,7 +17,7 @@ module Mastodon
end
def default_prerelease
'alpha.4'
'alpha.5'
end
def prerelease

View File

@@ -20,17 +20,19 @@ RSpec.describe FollowRequest do
end
end
it 'calls Account#follow!, MergeWorker.perform_async, and #destroy!' do
it 'calls Account#follow!, MergeWorker.perform_async, ActivityPub::AccountBackfillWorker, and #destroy!' do
allow(account).to receive(:follow!) do
account.active_relationships.create!(target_account: target_account)
end
allow(MergeWorker).to receive(:perform_async)
allow(ActivityPub::AccountBackfillWorker).to receive(:perform_async)
allow(follow_request).to receive(:destroy!)
follow_request.authorize!
expect(account).to have_received(:follow!).with(target_account, reblogs: true, notify: false, uri: follow_request.uri, languages: nil, bypass_limit: true)
expect(MergeWorker).to have_received(:perform_async).with(target_account.id, account.id, 'home')
expect(ActivityPub::AccountBackfillWorker).to have_received(:perform_async).with(target_account.id)
expect(follow_request).to have_received(:destroy!)
end
@@ -47,6 +49,21 @@ RSpec.describe FollowRequest do
target = follow_request.target_account
expect(follow_request.account.muting_reblogs?(target)).to be true
end
context 'when subsequent follow requests are made' do
before do
second_account = Fabricate(:account)
second_account.follow!(target_account)
end
it 'doesnt call ActivityPub::AccountBackfillWorker' do
allow(ActivityPub::AccountBackfillWorker).to receive(:perform_async)
follow_request.authorize!
expect(ActivityPub::AccountBackfillWorker).to_not have_received(:perform_async)
end
end
end
describe '#reject!' do

View File

@@ -0,0 +1,112 @@
# frozen_string_literal: true
require 'rails_helper'
RSpec.describe ActivityPub::AccountBackfillService do
subject { described_class.new }
before do
stub_const('ActivityPub::AccountBackfillService::ENABLED', true)
end
let!(:account) { Fabricate(:account, domain: 'other.com', outbox_url: 'http://other.com/alice/outbox') }
let!(:outbox) do
{
'@context': 'https://www.w3.org/ns/activitystreams',
id: 'http://other.com/alice/outbox',
type: 'OrderedCollection',
first: 'http://other.com/alice/outbox?page=true',
}.with_indifferent_access
end
let!(:items) do
[
{
'@context': 'https://www.w3.org/ns/activitystreams',
id: 'https://other.com/alice/1234',
to: ['https://www.w3.org/ns/activitystreams#Public'],
cc: ['https://other.com/alice/followers'],
type: 'Note',
content: 'Lorem ipsum',
attributedTo: 'http://other.com/alice',
},
'https://other.com/alice/5678',
]
end
let!(:outbox_page) do
{
'@context': 'https://www.w3.org/ns/activitystreams',
id: 'http://example.com/alice/outbox?page=true',
type: 'OrderedCollectionPage',
orderedItems: items,
}
end
describe '#call' do
before do
stub_request(:get, 'http://other.com/alice/outbox').to_return(status: 200, body: Oj.dump(outbox), headers: { 'Content-Type': 'application/activity+json' })
stub_request(:get, 'http://other.com/alice/outbox?page=true').to_return(status: 200, body: Oj.dump(outbox_page), headers: { 'Content-Type': 'application/activity+json' })
end
it 'fetches the items in the outbox' do
allow(FetchReplyWorker).to receive(:push_bulk)
got_items = subject.call(account)
expect(got_items[0].deep_symbolize_keys).to eq(items[0])
expect(got_items[1]).to eq(items[1])
expect(FetchReplyWorker).to have_received(:push_bulk).with([items[0].stringify_keys, items[1]])
end
context 'with followers-only and private statuses' do
let!(:items) do
[
{
'@context': 'https://www.w3.org/ns/activitystreams',
id: 'https://other.com/alice/public',
type: 'Note',
to: ['https://www.w3.org/ns/activitystreams#Public'],
cc: ['https://other.com/alice/followers'],
content: 'Lorem ipsum',
attributedTo: 'http://other.com/alice',
},
{
'@context': 'https://www.w3.org/ns/activitystreams',
id: 'https://other.com/alice/unlisted',
to: ['https://other.com/alice/followers'],
cc: ['https://www.w3.org/ns/activitystreams#Public'],
type: 'Note',
content: 'Lorem ipsum',
attributedTo: 'http://other.com/alice',
},
{
'@context': 'https://www.w3.org/ns/activitystreams',
id: 'https://other.com/alice/followers-only',
to: ['https://other.com/alice/followers'],
type: 'Note',
content: 'Lorem ipsum',
attributedTo: 'http://other.com/alice',
},
{
'@context': 'https://www.w3.org/ns/activitystreams',
id: 'https://other.com/alice/dm',
to: ['https://other.com/alice/followers'],
type: 'Note',
content: 'Lorem ipsum',
attributedTo: 'http://other.com/alice',
},
]
end
it 'only processes public and unlisted statuses' do
allow(FetchReplyWorker).to receive(:push_bulk)
got_items = subject.call(account)
expect(got_items.length).to eq(2)
expect(got_items[0].deep_symbolize_keys).to eq(items[0])
expect(got_items[1].deep_symbolize_keys).to eq(items[1])
expect(FetchReplyWorker).to have_received(:push_bulk).with([items[0].stringify_keys, items[1].stringify_keys])
end
end
end
end