Setting up a proper client access scheme for BatchIQ was both easier and harder than I expected. Easy, in that AWS publishes some good blog posts and documentation on how to do it. Hard, in that many prospective customers I've talked to are not on the same page with AWS on this.
For the few AWS services that support resource policies, those can be both the simplest and most convenient method. But this is restricted to S3, SNS, SQS, and a few other services.
The controversy, if you can call it that, is around Role access. Role access is the method recommended, documented, and pushed by AWS. Many customers expect to simply pass along an Access Key/Secret Access Key pair for an IAM User instead.
Let's break down some of the pros and cons of each method.
Concerns
- Security
- Maintenance of creds - who has creds, permission updates, expiring
- Logging
Cross-Account Access Method
Method Matrix
Anatomy of sts:AssumeRole
It might help to take a look at the documentation for the sts:AssumeRole call, which is where a role gets turned into temporary credentials. AssumeRole requires a RoleArn and a RoleSessionName.
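In boto3 terms, that call looks roughly like the sketch below. The role ARN and session name here are placeholders for illustration, not values from any real setup.

```python
import boto3

sts = boto3.client("sts")

# RoleArn and RoleSessionName are the two required parameters.
# The ARN below is a placeholder for the customer's role.
response = sts.assume_role(
    RoleArn="arn:aws:iam::111122223333:role/BatchIQAccess",
    RoleSessionName="batchiq-example-session",
)

# The response carries temporary credentials that expire on their own.
creds = response["Credentials"]
s3 = boto3.client(
    "s3",
    aws_access_key_id=creds["AccessKeyId"],
    aws_secret_access_key=creds["SecretAccessKey"],
    aws_session_token=creds["SessionToken"],
)
```

The nice part is that the temporary credentials age out automatically, so there's no long-lived secret to rotate or revoke.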
Advice
I would prefer resource-based policies where they are available, especially for file-type access to S3. Making some files publicly available, or available to certain partners, is easier and more naturally covered by an S3 Bucket Policy. S3 also supports its own access logging, which covers reads.
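As a rough illustration, a bucket policy granting a partner account read access could be applied with boto3 along these lines. The bucket name and partner account ID are made up.

```python
import json
import boto3

# Hypothetical bucket and partner account ID, for illustration only.
BUCKET = "example-batchiq-dropbox"
PARTNER_ACCOUNT_ID = "111122223333"

policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Sid": "AllowPartnerRead",
            "Effect": "Allow",
            "Principal": {"AWS": f"arn:aws:iam::{PARTNER_ACCOUNT_ID}:root"},
            "Action": ["s3:GetObject", "s3:ListBucket"],
            "Resource": [
                f"arn:aws:s3:::{BUCKET}",
                f"arn:aws:s3:::{BUCKET}/*",
            ],
        }
    ],
}

s3 = boto3.client("s3")
s3.put_bucket_policy(Bucket=BUCKET, Policy=json.dumps(policy))
```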
For most other access, I definitely recommend roles. Once roles are set up, they work much better than a pile of long-lived keys, both for you and for your service providers.
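On the customer side, granting a provider role access boils down to creating a role whose trust policy names the provider's account, then attaching whatever permissions the provider actually needs. A minimal sketch, with placeholder account IDs, role name, and a read-only S3 managed policy standing in for the real permissions:

```python
import json
import boto3

# Placeholder account ID for the service provider that will assume the role.
PROVIDER_ACCOUNT_ID = "444455556666"

# Trust policy: allow principals in the provider's account to call sts:AssumeRole.
trust_policy = {
    "Version": "2012-10-17",
    "Statement": [
        {
            "Effect": "Allow",
            "Principal": {"AWS": f"arn:aws:iam::{PROVIDER_ACCOUNT_ID}:root"},
            "Action": "sts:AssumeRole",
        }
    ],
}

iam = boto3.client("iam")
iam.create_role(
    RoleName="VendorAccessRole",  # hypothetical name
    AssumeRolePolicyDocument=json.dumps(trust_policy),
)

# Attach only the permissions the vendor needs; read-only S3 here as an example.
iam.attach_role_policy(
    RoleName="VendorAccessRole",
    PolicyArn="arn:aws:iam::aws:policy/AmazonS3ReadOnlyAccess",
)
```

In practice AWS also recommends adding an ExternalId condition to the trust policy when the role is assumed by a third party, but the shape of the setup stays the same.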
But there are probably still some cases where granting cross-account access with IAM User accounts makes sense. Human access to the Management Console is probably at the top of that list, although AWS has defined a path for using roles in that scenario as well.
Also, I expect AWS to eventually work out an OAuth-style click-through flow for granting access to vendors. AWS has been doing this for its own services for a while now, and it's clearly the preferred pattern around the interwebs.