To clarify, you can always run an amp at a higher ohm load than it's rated for, but the power drops roughly in proportion to the ratio of the rated ohm load to the actual ohm load (assuming the amp wasn't built to hold its rated power across a range of impedances).
(rated ohm load/actual ohm load) = (actual power/rated power)
If that's confusing, here's an example: if your amp is rated at 400w at 1 ohm but you run it at 1.5 ohms, you'll get about 267w out of it (theoretically).
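If you'd rather just plug numbers in, here's a minimal sketch of that formula in Python. It assumes the amp is simply voltage-limited (P = V^2/R), which is where the inverse relationship comes from; the function name and the guard against running below the rated load are just illustrative.

```python
# Rough estimate of output power when running an amp at a
# higher-than-rated ohm load. Assumes a voltage-limited amp,
# so power scales inversely with impedance: P ~ V^2 / R.

def estimated_power(rated_power_w, rated_ohms, actual_ohms):
    """Estimate output power at actual_ohms given the rating at rated_ohms."""
    if actual_ohms < rated_ohms:
        # Below the rated load the amp may overheat or current-limit,
        # so this simple estimate no longer applies.
        raise ValueError("Estimate only holds at or above the rated impedance.")
    return rated_power_w * (rated_ohms / actual_ohms)

# The example above: a 400 W @ 1 ohm amp run at 1.5 ohms.
print(estimated_power(400, 1.0, 1.5))  # ~266.7 W
```

Real amps won't track this exactly (supply sag, regulated rails, etc.), but it's a decent ballpark for what to expect.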