In my previous company, we adopted a microservice architecture and used Docker to implement it. The average size of our Docker images was ~300MB to ~600MB. However, my new company uses Docker mostly for its development workflow, and the average image size is ~1.5GB to ~3GB. Some of the larger images (10GB+) are being actively refactored to reduce their size.
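For context, the refactoring work is mostly the standard multi-stage build approach, roughly along these lines (a minimal sketch only; the Go base image, distroless runtime image, and paths are illustrative placeholders, not our actual setup):

```dockerfile
# Build stage: carries the full toolchain and build dependencies
FROM golang:1.22 AS build
WORKDIR /src
COPY . .
RUN CGO_ENABLED=0 go build -o /out/app ./cmd/app

# Runtime stage: ships only the compiled binary, no compiler or package manager
FROM gcr.io/distroless/static-debian12
COPY --from=build /out/app /app
ENTRYPOINT ["/app"]
```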
From everything I have read, I feel that these images are too large and we will run into issues down the line, but the rest of the team feels that Docker Engine and Docker Swarm should handle those image sizes without problems.
My question: Is there an accepted ideal size range for Docker images, and what pitfalls (if any) will I face running a workflow built around multi-GB images?