Technical Briefs

Nonminimum-Phase Phenomenon of PEM Fuel Cell Membrane Humidifiers

Author and Article Information
Dongmei Chen

Department of Mechanical Engineering, University of Michigan, Ann Arbor, MI 48109-2125

Huei Peng1

Department of Mechanical Engineering, University of Michigan, Ann Arbor, MI 48109-2125; hpeng@umich.edu

Perma Pure is a registered trademark of Perma Pure Inc.

1 Corresponding author.

J. Dyn. Sys., Meas., Control 130(4), 044501 (Jun 04, 2008) (9 pages) doi:10.1115/1.2936381 History: Received March 04, 2005; Revised February 03, 2008; Published June 04, 2008

A membrane-based humidifier that uses the cooling water of a fuel cell system to humidify the inlet air is modeled and analyzed in this paper. The four-state lumped model is simple, yet it captures the humidification behavior accurately. A peculiar characteristic of this system is that it exhibits nonminimum-phase (NMP) behavior. The reason the NMP behavior exists and the effect of system parameters on the location of the NMP zero are analyzed. A proportional control algorithm is proposed to reject the effect of system disturbances, and a feed-forward algorithm is developed to ensure proper humidifier operation under air flow rate changes. Because the NMP zero lies in the disturbance-to-output path, the proposed algorithm successfully eliminates the undershoot phenomenon associated with the NMP zero. However, the disturbance-to-output path is coupled with the input-to-output path, so the NMP zero could still affect the feedback control design.
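As an illustration of the undershoot described above, the following is a minimal numerical sketch of a right-half-plane (NMP) zero in a step response. It uses a generic second-order transfer function with assumed time constants and zero location, not the paper's four-state humidifier model.

# Illustrative sketch only: a generic second-order transfer function with a
# right-half-plane (nonminimum-phase) zero, NOT the humidifier model from
# the paper. Time constants and zero location are assumed values chosen to
# make the undershoot visible.
import numpy as np
from scipy import signal

tau_z = 2.0               # assumed RHP-zero time constant -> zero at s = +1/tau_z
tau_1, tau_2 = 1.0, 5.0   # assumed pole time constants

# G(s) = (1 - tau_z*s) / ((tau_1*s + 1)*(tau_2*s + 1))
num = [-tau_z, 1.0]
den = np.polymul([tau_1, 1.0], [tau_2, 1.0])
G = signal.TransferFunction(num, den)

t, y = signal.step(G, T=np.linspace(0.0, 30.0, 600))
print(f"initial undershoot: min(y) = {y.min():.3f}, steady state = {y[-1]:.2f}")
# The response first moves in the "wrong" direction (y < 0) before settling
# at +1 -- the undershoot characteristic of an NMP zero.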

Copyright © 2008 by American Society of Mechanical Engineers


Figures

Figure 1: One humidifier unit
Figure 2: Control volumes of one humidifier unit
Figure 3: System responses under a step increase of the inlet air flow rate (inlet air temperature, 303 K; inlet air RH, 0; inlet water temperature, 353 K)
Figure 4: System responses under a step decrease of the inlet air temperature (inlet air flow rate, 0.02785 kg/s; inlet air RH, 0; inlet water temperature, 353 K)
Figure 5: System responses under a step increase of the inlet air RH (inlet air flow rate, 0.02785 kg/s; inlet air temperature, 303 K; inlet water temperature, 353 K)
Figure 6: System responses under an inlet air flow rate step increase
Figure 7: Air outlet RH and RH rate responses under an inlet air flow rate step increase
Figure 8: Lambda versus gas RH
Figure 9: Root locus of the four-state system with the inlet air flow rate as the system disturbance
Figure 10: Humidifier test setup: T, temperature; P, pressure; RH, relative humidity
Figure 11: Dynamic test result versus model prediction (19°C air inlet temperature; 12°C water temperature)
Figure 12: System block diagram
Figure 13: Humidifier desired total vapor rate versus inlet air flow rate (based on Ford P2000 fuel cell prototype vehicle at 80°C)
Figure 14: System responses with and without control under an inlet air flow rate step increase (the dashed line and the dotted line are on top of each other in the right two plots)
Figure 15: System responses with and without control under an inlet temperature step decrease
Figure 16: System responses with and without control under an inlet RH step increase
